class definitive 'commands out of sync' solution - php

After about a week of searching I can find no conclusive / efficient / up-to-date / standard way of doing this.
The obvious solution is to run a query, get the result into a variable, and clear down ready for the next query - for efficiency, NOT closing the connection.
This is being done in a wrapper class which has select / update / insert methods, which in turn call a query or queries (for mysqli multi_query()), so ideally any leftover result would be cleared before the next query / queries run, so they are not affected by whatever has gone before.
That does not seem possible with some of the solutions I have found, because they need the result set to work on.
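For context, the "Commands out of sync; you can't run this command now" error is easiest to reproduce with multi_query(), because every result set after the first stays queued on the connection until it is fetched. A minimal stand-alone sketch (the table and credentials are made up and are not part of the wrapper class):

// hypothetical reproduction of the out-of-sync error
$conn = mysqli_connect("localhost", "user", "pass", "test");
mysqli_multi_query($conn, "UPDATE t SET x = 1; UPDATE t SET y = 2");
// the second statement's result has not been consumed yet, so...
$res = mysqli_query($conn, "SELECT * FROM t"); // fails: Commands out of sync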
Currently the select / update / insert call:
private function queryclearup($dataresource) {
    // clear up if required
    switch ($this->dbtype) {
        case "mysqli":
            @mysqli_next_result($this->connection);
            @mysqli_free_result($dataresource); // clear result to avoid out of sync issues
            break;
    }
}
after dealing with the result set - for example converting it to a data array in the case of select, or just logging an error in the case of update / insert.
Still the problem persists with some statements (update) but not with others (select / insert).
public function select($query, $args = array()) {
    $this->loghelper->write("SQL Select\n" . $query);
    $args = _PHP::array_defaults($args, array(
        "pk" => null
    ));
    $data_table = array();
    $errors = null;
    $query = utf8_decode($query);
    switch ($this->dbtype) {
        case "mysqli":
            $dbdata = $this->query($query);
            $data_table = $this->resourceToDataTable($dbdata, $args);
            break;
    }
    return $data_table;
}
Then
private function resourceToDataTable($dataresource, $args = array()) {
    $data_table = array();
    if (gettype($dataresource) == "resource" || gettype($dataresource) == "object") {
        do {
            switch ($this->dbtype) {
                case "mysqli":
                    $row = $dataresource->fetch_assoc();
                    break;
            }
            if ($row) {
                array_push($data_table, $row);
            }
        } while ($row);
        if (!empty($args["pk"])) {
            // re-key the rows by the primary key column; first occurrence wins
            $pktable = array();
            foreach ($data_table as $d) {
                if (!array_key_exists($d[$args["pk"]], $pktable)) {
                    $pktable[$d[$args["pk"]]] = $d;
                }
            }
            $data_table = $pktable;
        }
    }
    $this->queryclearup($dataresource);
    return $data_table;
}

This appears to be what's required:
private function queryclearup($dataresource) {
    // clear up if required
    switch ($this->dbtype) {
        case "mysqli":
            while ($dump = @mysqli_next_result($this->connection)) {
                $donothing = 1;
            }
            @mysqli_free_result($dataresource); // clear result to avoid out of sync issues
            break;
    }
}
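For reference, the pattern the mysqli documentation tends to show for draining leftover result sets combines mysqli_more_results() with mysqli_next_result(), freeing each stored result along the way. A minimal sketch against the same $this->connection / $this->dbtype properties (untested in this class, so treat it as a starting point rather than a drop-in replacement):

private function queryclearup($dataresource) {
    switch ($this->dbtype) {
        case "mysqli":
            // free the result we were handed, if there is one
            if ($dataresource instanceof mysqli_result) {
                mysqli_free_result($dataresource);
            }
            // drain anything multi_query() or a stored procedure left behind,
            // otherwise the next query fails with "Commands out of sync"
            while (mysqli_more_results($this->connection)) {
                mysqli_next_result($this->connection);
                if ($extra = mysqli_store_result($this->connection)) {
                    mysqli_free_result($extra);
                }
            }
            break;
    }
}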

Related

Working with multiple rows from a MySQL query

Before I begin, I want to point out that I can solve my problem - I've practised enough in PHP to be able to work around what I'm trying to do. However, I want to make it modular. Without going into so much detail that it confuses my problem, I will simplify what I am trying to do so that it does not detract from the purpose of what I'm doing. Keep that in mind.
I am developing a simple CMS to manage a user database and edit their information. It features pagination (which works), and a button to the left that you click to open up a form to edit their information and submit it to the database (which also works).
What does not work is displaying each row from MySQL in a table, using a very basic script which I won't describe in too much detail. It basically does a database query with this:
SELECT * FROM users LIMIT (insert limit here) OFFSET (insert offset here)
Essentially, with pagination, the offset says how many rows to skip and the limit is how many users to display per page. These are set, defined, and tested to be accurate, and they do work. However, I am not too familiar with how to handle the results.
Here is an example query on page 2 for this CMS:
SELECT * FROM users LIMIT 10 OFFSET 10
This should return 10 rows, starting 10 users down in the database. And it does - when I try this query at the command prompt, it gives me what I need.
But when I try to handle this data in PHP like this:
<?php
while ($row = $db->query($pagination->get_content(), "row")) {
    print_r($row);
}
?>
$db->query method is:
public function query($sql, $type = "assoc") {
    $this->last_query = $sql;
    $result = mysql_query($sql, $this->connection);
    $this->confirm_query($result);
    if ($type == "row") {
        return mysql_fetch_row($result);
    } elseif ($type == "assoc" || true) {
        return mysql_fetch_assoc($result);
    } elseif ($type == "array") {
        return mysql_fetch_array($result);
    } elseif ($type == false) {
        return $result;
    }
}
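As a side note on the method above: because == binds more tightly than ||, the condition $type == "assoc" || true is always true, so the "array" and false branches can never be reached. A corrected dispatch (a sketch only, keeping the same mysql_* calls the class already uses) would look like:

if ($type == "row") {
    return mysql_fetch_row($result);
} elseif ($type == "assoc") {
    return mysql_fetch_assoc($result);
} elseif ($type == "array") {
    return mysql_fetch_array($result);
}
return $result; // $type === false: hand back the raw result resource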
$pagination->get_content method is:
public function get_content() {
    global $db;
    $query = $this->base_sql;
    if (!isset($_GET["page"])) {
        $query .= " LIMIT {$this->default_limit}";
        return $query;
    } elseif (isset($_GET["page"]) && $_GET["page"] == 1) {
        $query .= " LIMIT {$this->default_limit}";
        return $query;
    } elseif (isset($_GET["page"])) {
        $query .= " LIMIT {$this->default_limit}";
        $query .= " OFFSET " . (($_GET["page"] * $this->default_limit) - 10);
        return $query;
    }
}
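Incidentally, the offset calculation above only works while default_limit is 10, because the - 10 is hard-coded. The general form is (page - 1) * limit; a sketch using the same properties:

$page   = max(1, (int) $_GET["page"]);
$offset = ($page - 1) * $this->default_limit; // e.g. page 2 with limit 10 -> OFFSET 10
$query .= " LIMIT {$this->default_limit} OFFSET {$offset}";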
And my results from the while loop (which should print out each row of the database, no?) give me the same row every time, continuously, until PHP hits the memory limit / timeout limit.
Forgive me if it's something simple. I rarely handle database data in this manner. What I want it to do is show the 10 users I requested. Feel free to ask any questions.
EDIT: After some comments, I've decided to switch to the mysqli functions, and it works.
// performs a query, does a number of actions dependent on $type
public function query($sql, $type = false) {
    $sql = $this->escape($sql);
    if ($result = $this->db->query($sql)) {
        if ($type == false) {
            return $result;
        } elseif ($type == true || "assoc") {
            if ($result->num_rows >= 2) {
                $array = array();
                $i = 1;
                while ($row = $result->fetch_assoc()) {
                    $array[$i] = $row;
                    $i++;
                }
                return $array;
            } elseif ($result->num_rows == 1) {
                return $result->fetch_assoc();
            }
        } elseif ($type == "array") {
            if ($result->num_rows >= 2) {
                $array = array();
                $i = 1;
                while ($row = $result->fetch_array()) {
                    $array[$i] = $row;
                    $i++;
                }
                return $array;
            } elseif ($result->num_rows == 1) {
                return $result->fetch_array();
            }
        }
    } else {
        die("There was an error running the query, throwing error: " . $this->db->error);
    }
}
Basically, in short, I took my entire database class, deleted it, and rewrote it around the object-oriented mysqli API (as a class that extends mysqli). A better look at the full script can be found here:
http://pastebin.com/Bc00hESn
And yes, it does what I want it to. It queries multiple rows, and I can handle them however I wish using the very same methods I planned to use. Thank you for the help.
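One small simplification to the rewritten query() above, if the mysqlnd driver is available (an assumption - check with phpinfo()): mysqli_result::fetch_all() can replace the manual num_rows / while loops, e.g.:

if ($result = $this->db->query($sql)) {
    // every row as an associative array, in one call
    $rows = $result->fetch_all(MYSQLI_ASSOC);
    return count($rows) == 1 ? $rows[0] : $rows;
}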
I think you should run the query once and then fetch with mysql_fetch_assoc() in a loop over that single result, rather than calling $db->query() in the while condition - that re-executes the query on every iteration and always hands back its first row, which is why you keep seeing the same row. Roughly (this assumes query() is adjusted so that $type == false really does return the raw result resource):
<?php
$result = $db->query($pagination->get_content(), false); // raw result, not a single fetched row
while ($row = mysql_fetch_assoc($result)) {
    print_r($row);
}
?>

Large processes in PhalconPHP

I have a web app that is a logging application, and I need backup / restore / import / export features there. I did this successfully with Laravel but have some complications with Phalcon. I don't see native functionality in Phalcon for splitting the execution of large PHP scripts into chunks.
The logs will be backed up and restored, as well as imported by users, in ADIF format (adif.org). I have a parser for that format which converts the file into an array of arrays; every record then has to be checked against another table containing 2000 regular expressions, find 3-10 matches there, and the imported records in one table get connected to the matching rows in the other table (a hasMany model relation). That means every imported record needs quite a bit of processing time. Laravel somehow managed an import of 3500 records; I don't know how it will handle more. An average import will contain 10000 records, and each of them needs to be checked against the 2000 regular expressions.
The main issue is how to split this huge amount of processing into smaller chunks so that I don't get timeouts.
Here is the function that could flawlessly do the job of adding 3862 records to one table and, as a result of processing every record, 8119 records to another table:
public function restoreAction()
{
$this->view->disable();
$user = Users::findFirst($this->session->auth['id']);
if ($this->request->isPost()) {
if ($this->request->isAjax()) {
$frontCache = new CacheData(array(
"lifetime" => 21600
));
$cache = new CacheFile($frontCache, array(
"cacheDir" => "../plc/app/cache/"
));
$cacheKey = $this->request->getPost('fileName').'.cache';
$records = $cache->get($cacheKey);
if ($records === null) {
$rowsPerChunk = 50;
$adifSource = AdifHelper::parseFile(BASE_URL.'/uploads/'.$user->getUsername().'/'.$this->request->getPost('fileName'));
$records = array_chunk($adifSource, $rowsPerChunk);
$key = array_keys($records);
$size = count($key);
}
for ($i = 0; $i < $size; $i++) {
if (!isset($records[$i])) {
break;
}
set_time_limit(50);
for ($j=0; $j < $rowsPerChunk; $j++) {
$result = $records[$i][$j];
if (!isset($result)) {
break;
}
if(isset($result['call'])) {
$p = new PrefixHelper($result['call']);
}
$bandId = (isset($result['band']) && (strlen($result['band']) > 2)) ? Bands::findFirstByName($result['band'])->getId() : null;
$infos = (isset($p)) ? $p->prefixInfo() : null;
if (is_array($infos)) {
if (isset($result['qsl_sent']) && ($result['qsl_sent'] == 'q')) {
$qsl_rcvd = 'R';
} else if (isset($result['eqsl_qsl_sent']) && ($result['eqsl_qsl_sent'] == 'c')) {
$qsl_rcvd = 'y';
} else if (isset($result['qsl_rcvd'])) {
$qsl_rcvd = $result['qsl_rcvd'];
} else {
$qsl_rcvd ='i';
}
$logRow = new Logs();
$logRow->setCall($result['call']);
$logRow->setDatetime(date('Y-m-d H:i:s',strtotime($result['qso_date'].' '.$result['time_on'])));
$logRow->setFreq(isset($result['freq']) ? $result['freq'] : 0);
$logRow->setRst($result['rst_sent']);
$logRow->setQslnote(isset($result['qslmsg']) ? $result['qslmsg'] : '');
$logRow->setComment(isset($result['comment']) ? $result['comment'] : '');
$logRow->setQslRcvd($qsl_rcvd);
$logRow->setQslVia(isset($result['qsl_sent_via']) ? $result['qsl_sent_via'] : 'e');
$logRow->band_id = $bandId;
$logRow->user_id = $this->session->auth['id'];
$success = $logRow->save();
if ($success) {
foreach ($infos as $info) {
if (is_object($info)) {
$inf = new Infos();
$inf->setLat($info->lat);
$inf->setLon($info->lon);
$inf->setCq($info->cq);
$inf->setItu($info->itu);
if (isset($result['iota'])) {
$inf->setIota($result['iota']);
}
if (isset($result['pfx'])) {
$inf->setPfx($result['pfx']);
}
if (isset($result['gridsquare'])) {
$inf->setGrid($result['gridsquare']);
} else if (isset($result['grid'])) {
$inf->setGrid($result['grid']);
}
$inf->qso_id = $logRow->getId();
$inf->prefix_id = $info->id;
$infSuccess[] = $inf->save();
}
}
}
}
}
sleep(1);
}
}
}
}
I know the script needs a lot of improvement, but for now the task was just to make it work.
I think good practice for a large processing task in PHP is a console application, which doesn't have the execution-time restrictions and can be set up with more memory.
As for Phalcon, it has a built-in mechanism for running and processing CLI tasks - Command Line Applications (this link will always point to the documentation of the latest Phalcon version).
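To make that concrete, a Phalcon CLI task is just a class extending Phalcon\Cli\Task whose actions are invoked from a small console bootstrap. The class name, file location and the way the ADIF file name is passed in below are all assumptions, and the namespace casing (Phalcon\CLI\... vs Phalcon\Cli\...) differs between Phalcon versions:

use Phalcon\Cli\Task;

// app/tasks/ImportTask.php (hypothetical location)
class ImportTask extends Task
{
    // run with:  php cli.php import run <fileName>
    public function runAction(array $params = array())
    {
        $fileName = isset($params[0]) ? $params[0] : null;

        // parse the ADIF file and process it chunk by chunk here,
        // free of the web server's request time limit
        echo "Importing {$fileName}\n";
    }
}

The console bootstrap (cli.php) builds a Phalcon\Cli\Console with a CLI DI container and calls $console->handle(array('task' => 'import', 'action' => 'run', 'params' => $args)); long imports can also be queued from the web request and picked up by a cron job running this task.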

While Loop Through An Array

I'm creating a block builder that loops through blocks pulled from the database in order.
if( loop_blocks()) {
    while( loop_blocks()) {
        if( have_block('standard-content-block')) {
            echo 'standard-content-block';
        }
        elseif( have_block('executive-intro-block')) {
            echo 'executive intro block';
        }
    }
}
My function loop_blocks pulls the blocks from the database in order and sets the array as a global variable:
function loop_blocks() {
    global $db;
    $page_id = get_page_id();
    $GLOBALS['loop_position'] = 0;
    $loop_position = $GLOBALS['loop_position'];
    $stm = $db->prepare("SELECT * FROM page_blocks WHERE page_id = :id ORDER BY `block_order` ASC");
    $stm->execute(array(':id' => $page_id));
    $res = $stm->fetchAll();
    $GLOBALS['block_loop'] = $res;
    if(!$res) {
        return false;
    } elseif(!$GLOBALS['block_loop'][$loop_position]) {
        return false;
    } else {
        return true;
    }
}
The function have_block reads the current loop position, checks whether the block at that position matches the given name, and increments the loop position:
function have_block($block_name) {
    $loop_position = $GLOBALS['loop_position'];
    if(!$GLOBALS['block_loop'][$loop_position]) {
        return false;
    } elseif(!$GLOBALS['block_loop'][$loop_position][block_name] = $block_name) {
        return false;
    } else {
        $GLOBALS['loop_position'] = $loop_position+1;
        return true;
    }
}
This results in an infinite loop, however, and I can't figure out how to move the while loop on to the next step.
EDIT: I'm using a while loop because the function have_block will set up a global variable for the current block id. This will then be used within a function called the_element, such as:
if( loop_blocks()) {
    while( loop_blocks()) {
        if( have_block('standard-content-block')) {
            the_element('heading');
        }
    }
}
If I don't use the function have_block to set this up, then I'd need to pass the block id from the foreach loop into every element as a second argument.
I fixed this. As @Jim noted, I was re-setting loop_position within loop_blocks(), which was why the while loop repeated infinitely. After that it was a simple typing error:
} elseif(!$GLOBALS['block_loop'][$loop_position][block_name] = $block_name) {
return false;
Should have been:
} elseif($GLOBALS['block_loop'][$loop_position][block_name] != $block_name) {
return false;
Note that I had the exclamation point in the incorrect place.
This now works perfectly as I need.
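For anyone restructuring this later, the query can also be run just once and cached in the global, so loop_blocks() never resets the position. A minimal sketch, assuming the same $db PDO handle and get_page_id() helper:

function loop_blocks() {
    global $db;

    // fetch the blocks only on the first call for this page
    if (!isset($GLOBALS['block_loop'])) {
        $stm = $db->prepare("SELECT * FROM page_blocks WHERE page_id = :id ORDER BY `block_order` ASC");
        $stm->execute(array(':id' => get_page_id()));
        $GLOBALS['block_loop'] = $stm->fetchAll();
        $GLOBALS['loop_position'] = 0;
    }

    // true while there is still a block at the current position
    return isset($GLOBALS['block_loop'][$GLOBALS['loop_position']]);
}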

Proper way of handling multiple cases

I have a method that can return 3 different cases
public function check_verification_status($user_id) {
    global $db;
    $sql = "SELECT * FROM `users`
            WHERE `id` = ".clean($user_id)."
            AND `type_id` = 1";
    $result = @mysql_query($sql, $db); check_sql(mysql_error(), $sql, 0);
    $list = mysql_fetch_array($result);
    if ($list['verification_key'] == '' && !$list['verified']) {
        // no key and not verified
        return 0;
    } elseif ($list['verification_key'] != '' && !$list['verified']) {
        // key exists but not verified = email sent
        return 2;
    } elseif ($list['verification_key'] != '' && $list['verified']) {
        // verified
        return 1;
    }
}
A form / message is output depending on the return value from this.
I would have used a bool return value if there were only two cases; what is the proper way of handling more than two cases, and what would the ideal return values be?
The way I call this:
$v_status = $ver->check_verification_status($user_id);
if ($v_status === 0) {
    //do something
} elseif ($v_status === 1) {
    //do something else
} elseif ($v_status === 2) {
    //do something totally different
}
I want to learn the right way of handling such cases, as I run into them often.
Note: I know I need to upgrade to mysqli or PDO; it's coming soon.
What you have is fine, but you can also use a switch statement:
$v_status = $ver->check_verification_status($user_id);
switch ($v_status) {
    case 0: {
        //do something
        break;
    }
    case 1: {
        //do something else
        break;
    }
    case 2: {
        //do something totally different
        break;
    }
}
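Another option worth considering (a sketch with made-up constant names, not from the original code) is to give the three states named class constants, so neither side of the call has to remember what 0 / 1 / 2 mean:

class Verification
{
    const STATUS_NONE     = 0; // no key and not verified
    const STATUS_VERIFIED = 1; // verified
    const STATUS_PENDING  = 2; // key exists but not verified (email sent)
}

switch ($ver->check_verification_status($user_id)) {
    case Verification::STATUS_NONE:
        // show the "start verification" form
        break;
    case Verification::STATUS_PENDING:
        // remind the user to check their email
        break;
    case Verification::STATUS_VERIFIED:
        // carry on as normal
        break;
}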

silverstripe, how to use the doPublish()

I am working with SilverStripe and I am making a news page.
I use the DataObjectAsPage module (http://www.ssbits.com/tutorials/2012/dataobject-as-pages-the-module/), and I got it working when I use the admin to publish news items.
Now I want to use the DataObjectManager module instead of the admin module to manage my news items, but this is where the problem lies. Everything works fine in draft mode: I can make a new news item and it shows up in draft. But when I want to publish a news item, it won't show up in live or published mode.
I'm using the following tables:
- DataObjectAsPage table
- DataObjectAsPage_Live table
- NewsArticle table
- NewsArticle_Live table
When publishing, the articles are inserted into the DataObjectAsPage table and the NewsArticle table... but not into the _Live tables...
It seems the doPublish() function isn't being used while 'publishing'.
So I'm trying to use the following:
function onAfterWrite() {
    parent::onAfterWrite();
    DataObjectAsPage::doPublish();
}
But when I use this, I get an error (the screenshot is not included here).
It seems to be in a loop....
I've got the NewsArticle.php file where I use this function:
function onAfterWrite() {
    parent::onAfterWrite();
    DataObjectAsPage::doPublish();
}
This function calls into the DataObjectAsPage.php file, which uses this code:
function doPublish() {
    if (!$this->canPublish()) return false;
    $original = Versioned::get_one_by_stage("DataObjectAsPage", "Live", "\"DataObjectAsPage\".\"ID\" = $this->ID");
    if(!$original) $original = new DataObjectAsPage();
    // Handle activities undertaken by decorators
    $this->invokeWithExtensions('onBeforePublish', $original);
    $this->Status = "Published";
    //$this->PublishedByID = Member::currentUser()->ID;
    $this->write();
    $this->publish("Stage", "Live");
    // Handle activities undertaken by decorators
    $this->invokeWithExtensions('onAfterPublish', $original);
    return true;
}
And then it goes into the DataObject.php file and uses the write() function:
public function write($showDebug = false, $forceInsert = false, $forceWrite = false, $writeComponents = false) {
$firstWrite = false;
$this->brokenOnWrite = true;
$isNewRecord = false;
if(self::get_validation_enabled()) {
$valid = $this->validate();
if(!$valid->valid()) {
// Used by DODs to clean up after themselves, eg, Versioned
$this->extend('onAfterSkippedWrite');
throw new ValidationException($valid, "Validation error writing a $this->class object: " . $valid->message() . ". Object not written.", E_USER_WARNING);
return false;
}
}
$this->onBeforeWrite();
if($this->brokenOnWrite) {
user_error("$this->class has a broken onBeforeWrite() function. Make sure that you call parent::onBeforeWrite().", E_USER_ERROR);
}
// New record = everything has changed
if(($this->ID && is_numeric($this->ID)) && !$forceInsert) {
$dbCommand = 'update';
// Update the changed array with references to changed obj-fields
foreach($this->record as $k => $v) {
if(is_object($v) && method_exists($v, 'isChanged') && $v->isChanged()) {
$this->changed[$k] = true;
}
}
} else{
$dbCommand = 'insert';
$this->changed = array();
foreach($this->record as $k => $v) {
$this->changed[$k] = 2;
}
$firstWrite = true;
}
// No changes made
if($this->changed) {
foreach($this->getClassAncestry() as $ancestor) {
if(self::has_own_table($ancestor))
$ancestry[] = $ancestor;
}
// Look for some changes to make
if(!$forceInsert) unset($this->changed['ID']);
$hasChanges = false;
foreach($this->changed as $fieldName => $changed) {
if($changed) {
$hasChanges = true;
break;
}
}
if($hasChanges || $forceWrite || !$this->record['ID']) {
// New records have their insert into the base data table done first, so that they can pass the
// generated primary key on to the rest of the manipulation
$baseTable = $ancestry[0];
if((!isset($this->record['ID']) || !$this->record['ID']) && isset($ancestry[0])) {
DB::query("INSERT INTO \"{$baseTable}\" (\"Created\") VALUES (" . DB::getConn()->now() . ")");
$this->record['ID'] = DB::getGeneratedID($baseTable);
$this->changed['ID'] = 2;
$isNewRecord = true;
}
// Divvy up field saving into a number of database manipulations
$manipulation = array();
if(isset($ancestry) && is_array($ancestry)) {
foreach($ancestry as $idx => $class) {
$classSingleton = singleton($class);
foreach($this->record as $fieldName => $fieldValue) {
if(isset($this->changed[$fieldName]) && $this->changed[$fieldName] && $fieldType = $classSingleton->hasOwnTableDatabaseField($fieldName)) {
$fieldObj = $this->dbObject($fieldName);
if(!isset($manipulation[$class])) $manipulation[$class] = array();
// if database column doesn't correlate to a DBField instance...
if(!$fieldObj) {
$fieldObj = DBField::create('Varchar', $this->record[$fieldName], $fieldName);
}
// Both CompositeDBFields and regular fields need to be repopulated
$fieldObj->setValue($this->record[$fieldName], $this->record);
if($class != $baseTable || $fieldName!='ID')
$fieldObj->writeToManipulation($manipulation[$class]);
}
}
// Add the class name to the base object
if($idx == 0) {
$manipulation[$class]['fields']["LastEdited"] = "'".SS_Datetime::now()->Rfc2822()."'";
if($dbCommand == 'insert') {
$manipulation[$class]['fields']["Created"] = "'".SS_Datetime::now()->Rfc2822()."'";
//echo "<li>$this->class - " .get_class($this);
$manipulation[$class]['fields']["ClassName"] = "'$this->class'";
}
}
// In cases where there are no fields, this 'stub' will get picked up on
if(self::has_own_table($class)) {
$manipulation[$class]['command'] = $dbCommand;
$manipulation[$class]['id'] = $this->record['ID'];
} else {
unset($manipulation[$class]);
}
}
}
$this->extend('augmentWrite', $manipulation);
// New records have their insert into the base data table done first, so that they can pass the
// generated ID on to the rest of the manipulation
if(isset($isNewRecord) && $isNewRecord && isset($manipulation[$baseTable])) {
$manipulation[$baseTable]['command'] = 'update';
}
DB::manipulate($manipulation);
if(isset($isNewRecord) && $isNewRecord) {
DataObjectLog::addedObject($this);
} else {
DataObjectLog::changedObject($this);
}
$this->onAfterWrite();
$this->changed = null;
} elseif ( $showDebug ) {
echo "<b>Debug:</b> no changes for DataObject<br />";
// Used by DODs to clean up after themselves, eg, Versioned
$this->extend('onAfterSkippedWrite');
}
// Clears the cache for this object so get_one returns the correct object.
$this->flushCache();
if(!isset($this->record['Created'])) {
$this->record['Created'] = SS_Datetime::now()->Rfc2822();
}
$this->record['LastEdited'] = SS_Datetime::now()->Rfc2822();
} else {
// Used by DODs to clean up after themselves, eg, Versioned
$this->extend('onAfterSkippedWrite');
}
// Write ComponentSets as necessary
if($writeComponents) {
$this->writeComponents(true);
}
return $this->record['ID'];
}
Look at the $this->onAfterWrite();
It probably goes to my own function in NewsArticle.php, and that is where the loop starts! I'm not sure though, so I could use some help!
Does anyone know how to use the doPublish() function?
The reason that is happening is that the DataObjectAsPage::doPublish() method calls ->write() - line 11 of your 3rd code sample.
So what happens is it calls ->write(); at the end of ->write() your onAfterWrite() method is called, which calls doPublish(), which calls write() again.
If you remove the onAfterWrite() function that you've added, it should work as expected.
The doPublish() method on DataObjectAsPage will take care of publishing from Stage to Live for you.
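If the publish really does need to be triggered automatically on save, one common way to keep the write() -> onAfterWrite() -> doPublish() -> write() chain from recursing is a simple guard flag. A sketch only (the property name is made up and this is untested against the DataObjectAsPage module):

class NewsArticle extends DataObjectAsPage
{
    protected $isPublishing = false;

    function onAfterWrite()
    {
        parent::onAfterWrite();

        if (!$this->isPublishing) {
            $this->isPublishing = true;
            $this->doPublish(); // call it on the instance, not statically
            $this->isPublishing = false;
        }
    }
}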
