I created a script that, for a given game situation, tries to find the best possible move. It does this by simulating each possible move and quantifying it, thus deciding which move is the best to take (the one that will result in the fastest victory). To make it faster, I've implemented PHP's pthreads extension in the following way: each time the main thread needs to find a possible move (let's call this a JOB), it calculates all the possible moves at the current depth, then starts a Pool and adds each possible move to it (let's call these TASKS), so the threads develop the game tree for each move separately, for all the additional depths.
This would look something like this:
(1) Got a new job with 10 possible moves
(2) Created a new pool
(3) Added all the moves as tasks to the pool
(4) The tasks work concurrently, and each returns an integer as a result, which is stored in a Volatile object
(5) The main thread selects a single move and performs it
.... the same gets repeated from (1) until the fight is complete
Right now, the TASKS use their own caches: while they work, they save cache entries and reuse them, but they do not share caches between themselves, and they do not carry caches over from one JOB to the next. I tried to resolve this, and in a way I managed, but I don't think my approach is the intended one, because it makes everything WAY slower.
What I tried to do is as follows: create a class that stores all the cache hashes in arrays, then, before creating the pool, add it to a Volatile object. Before a task runs, it retrieves this cache and uses it for read/write operations, and when the task finishes, it merges its cache with the instance held in the Volatile object. This works, in that the caches made in JOB 1 can be seen in JOB 2, but it makes the whole process much slower than it was when each thread only used its own cache, which was built while building the tree and destroyed when the thread finished. Am I doing this wrong, or is the thing I want simply not achievable? Here's my code:
class BattlefieldWork extends Threaded {
    public $taskId;
    public $innerIterator;
    public $thinkAhead;
    public $originalBattlefield;
    public $iteratedBattlefield;
    public $hashes;

    public function __construct($taskId, $thinkAhead, $innerIterator, Battlefield $originalBattlefield, Battlefield $iteratedBattlefield) {
        $this->taskId = $taskId;
        $this->innerIterator = $innerIterator;
        $this->thinkAhead = $thinkAhead;
        $this->originalBattlefield = $originalBattlefield;
        $this->iteratedBattlefield = $iteratedBattlefield;
    }

    public function run() {
        $result = 0;
        $dataSet = $this->worker->getDataSet();
        $HashClassShared = null;
        $dataSet->synchronized(function ($dataSet) use (&$HashClassShared) {
            $HashClassShared = $dataSet['hashes'];
        }, $dataSet);
        $myHashClass = clone $HashClassShared;

        $thinkAhead = $this->thinkAhead;
        $innerIterator = $this->innerIterator;
        $originalBattlefield = $this->originalBattlefield;
        $iteratedBattlefield = $this->iteratedBattlefield;

        // the actual recursive function that will build the tree and calculate a quantification
        // for the move; this will use the hash class cloned above
        $result = $this->performThinkAheadMoves($thinkAhead, $innerIterator, $originalBattlefield, $iteratedBattlefield, $myHashClass);

        // I am trying to retrieve the common cache here, and upload the result of this thread
        $HashClassShared = null;
        $dataSet->synchronized(function ($dataSet) use ($result, $myHashClass, &$HashClassShared) {
            // I am storing the result of this thread
            $dataSet['results'][$this->taskId] = $result;
            // I am merging the data I've collected in this thread with the data
            // that is stored in the `Volatile` object
            $HashClassShared = $dataSet['hashes'];
            $HashClassShared = $HashClassShared->merge($myHashClass);
        }, $dataSet);
    }
}
This is how I create my tasks, my Volatile, and my Pool:
class Battlefield {
    /* ... */

    public function step() {
        /* ... */
        /* get the possible moves for the current depth, that is 0, and store them in an array named $moves */
        // $nextInnerIterator is an int, which shows which hero must take an action after the current move
        // $StartingBattlefield is the zero-point Battlefield, which will be used in quantification
        foreach ($moves as $moveid => $move) {
            $moves[$moveid]['quantify'] = new BattlefieldWork($moveid, self::$thinkAhead, $nextInnerIterator, $StartingBattlefield, $this);
        }

        $Volatile = new Volatile();
        $Volatile['results'] = array();
        $Volatile['hashes'] = $this->HashClass;

        $pool = new Pool(6, 'BattlefieldWorker', [$Volatile]);
        foreach ($moves as $moveid => $move) {
            if (is_a($moves[$moveid]['quantify'], 'BattlefieldWork')) {
                $pool->submit($moves[$moveid]['quantify']);
            }
        }
        while ($pool->collect());
        $pool->shutdown();

        $HashClass = $Volatile['hashes'];
        $this->HashClass = $Volatile['hashes'];
        foreach ($Volatile['results'] as $moveid => $partialResult) {
            $moves[$moveid]['quantify'] = $partialResult;
        }
        /* The moves are ordered based on quantify, one is selected, and then if the battle is not yet finished, step() is called again */
    }
}
And here is how I am merging two hash classes:
class HashClass {
    public $id = null;
    public $cacheDir;
    public $battlefieldHashes = array();
    public $battlefieldCleanupHashes = array();
    public $battlefieldMoveHashes = array();

    public function merge(HashClass $HashClass) {
        $this->battlefieldCleanupHashes = array_merge($this->battlefieldCleanupHashes, $HashClass->battlefieldCleanupHashes);
        $this->battlefieldMoveHashes = array_merge($this->battlefieldMoveHashes, $HashClass->battlefieldMoveHashes);
        return $this;
    }
}
I've benchmarked each part of the code to see where I am losing time, but everything seems fast enough not to warrant the slowdown I am experiencing. What I am thinking is that the problem lies in the threads: sometimes it seems that no work is being done at all, as if they are waiting for some other thread. Any insight into what the problem could be would be greatly appreciated.
By "view" I mean a view in the database:
create view `vMaketType` as select * from MaketType
I have this view in the database, but because Doctrine can't support views right now, I am using a raw query and fetching the results one by one:
$em = $this->getDoctrine()->getManager();
$con = $em->getConnection();
$stmt = $con->executeQuery('SELECT * FROM vMaketType');
$domain = [];
// I must fetch it and set it one by one
foreach ($stmt->fetchAll() as $row) {
    $obj = new vMaketType();
    $obj->setId($row["Id"]);
    $obj->setName($row["Name"]);
    $obj->setAmount($row["Amount"]);
    array_push($domain, $obj);
}
For me this really takes too much time to code one by one.
vMaketType is a custom entity I created to send data from the controller to the Twig view.
Is there an easier way to fetch the results into an array of vMaketType objects?
Because I have a view with 24 fields, I wish there were an easier way.
Perhaps you can try the serializer:
$obj = $this->get('serializer')->deserialize($row, 'Namespace\MaketType', 'array');
The code is not tested and may need tweaks; see the related doc.
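For completeness, here is a minimal sketch of how the whole loop might look with that approach (an assumption on my part: it relies on the default Symfony serializer with the ObjectNormalizer registered, so that row keys such as Id, Name and Amount map onto the entity's setters):

$con = $this->getDoctrine()->getManager()->getConnection();
$stmt = $con->executeQuery('SELECT * FROM vMaketType');

$domain = [];
foreach ($stmt->fetchAll() as $row) {
    // denormalize() turns one associative-array row into a vMaketType instance
    $domain[] = $this->get('serializer')->denormalize($row, vMaketType::class);
}

That would replace the 24 hand-written setter calls with a single denormalize() call per row.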
If I use a hydrator to put the values of my query results into an instance of my model, I'm going to have to loop through the results twice (once to hydrate each row into an object, and then again when I actually use the results).
I know there are various PDO::FETCH_* modes, such as FETCH_INTO and FETCH_CLASS, which seem to achieve the same functionality. The only difference I can see is the potential to manipulate the data during hydration.
I'm just trying to figure out the reasoning behind using hydrators when they require an additional iteration through the result set. I feel there has to be some reason or method that makes that additional iteration acceptable, and I'm curious what it is. I'm not sure why a programming pattern like hydrators is so popular, because on the surface it seems like a waste in most situations. Is manipulating the data during hydration the only reason to use hydrators?
"Find All" method from mapper:
($this->hydrator is an instance of Zend\Stdlib\Hydrator\ArraySerializable, but can be replaced with any other hydrator for the purpose of this question.)
/**
 * @return array|PagesInterface[]
 */
public function findAll() {
    // Perform select
    $sql = "SELECT * FROM CMSMAIN";
    $stmt = $this->dbal->prepare($sql);
    $result = $stmt->execute();
    if ($result === true) {
        //$stmt->setFetchMode(\PDO::FETCH_CLASS, $modelGoesHere); // Using FETCH_CLASS populates the model directly, eliminating the need for the extra iteration.
        $records = $stmt->fetchAll();
        $rows = array();
        foreach ($records as $value) { // First loop to hydrate
            $rows[] = $this->hydrator->hydrate($value, new PagesModel($this->logger));
        }
        return $rows;
    }
    return array();
}
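For reference, this is roughly what I mean by the FETCH_CLASS alternative from the commented-out line (a sketch, assuming an ordinary PDO statement whose columns match PagesModel's properties; FETCH_PROPS_LATE makes PDO call the constructor before assigning the columns):

// Single pass: PDO instantiates and populates one PagesModel per row
$stmt->setFetchMode(\PDO::FETCH_CLASS | \PDO::FETCH_PROPS_LATE, PagesModel::class, [$this->logger]);
$rows = $stmt->fetchAll();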
I've made this class to handle all of my SQL queries, but I'm unsure of how to use it properly.
The class looks something like this (this is a VERY simple version of it):
class sql {
    private $conn;
    private $data;

    function __construct() {
        // makes the connection to the DB and sets $conn and $data
    }

    public function select($variables, $table, $criterias) {
        // returns an array with all the info from the DB
    }

    function __destruct() {
        // closes the sql connection
    }
}
The question now is: is this going to overload the DB if I use it multiple times on every page load? (referred to as Example #1)
$dbInfo = (new sql)->select($var,$tab,$cri);
$moreInfo = (new sql)->select($var2,$tab2,$cri2);
$evenMoreInfo = (new sql)->select($var3,$tab3,$cri3);
Would it be beneficial to make my sql class's methods static?
Or should I not create a new instance of an sql object every time I want to make a query (like the example below, referred to as Example #2)?
$sql = new sql();
$dbInfo = $sql->select($var,$tab,$cri);
$moreInfo = $sql->select($var2,$tab2,$cri2);
$evenMoreInfo = $sql->select($var3,$tab3,$cri3);
How and when is Example #1 the better choice over Example #2, and vice versa?
If I assume that Example #1 is going to take the most resources from the DB, when would you pick Example #1 over Example #2?
Your Example #2 is more common to see; however, the SQL object is usually a static/singleton, so it connects to the database once per server request.
Your base SQL object should handle connecting to a database and then handle basic input/output, such as executing a string of SQL and returning the results.
You can then add new objects on top of that, one for each object/table, that then interface with this SQL singleton. These classes will handle constructing their custom SQL based on their table, joins, field names/types, etc.
E.g.:
A very basic 'table' object looks like this:
class SomeTableObject
{
    private $m_TableName = 'SomeTable'; // Table to get data from

    function GetSelectSQL()
    {
        return "SELECT * FROM ".$this->m_TableName;
    }

    function Select($where)
    {
        $sql = $this->GetSelectSQL().$where;
        return SqlSingleton::Execute($sql);
    }

    function GetByID($id)
    {
        $where = " WHERE FieldNameForID=$id";
        return $this->Select($where);
    }
}
These objects work better if they extend a base class that provides the basic GetSelectSQL(), m_TableName, Select(), etc. The GetByID methods (and other gets, updates, and inserts) will vary from table to table.
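For illustration, here is a minimal sketch of what the SqlSingleton referenced above could look like (hypothetical: the class name, DSN and credentials are placeholders, and it assumes PDO):

class SqlSingleton
{
    private static $conn = null;

    // Lazily opens a single connection per request and reuses it afterwards
    private static function connection()
    {
        if (self::$conn === null) {
            self::$conn = new \PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
            self::$conn->setAttribute(\PDO::ATTR_ERRMODE, \PDO::ERRMODE_EXCEPTION);
        }
        return self::$conn;
    }

    // Executes a string of SQL and returns the result rows
    public static function Execute($sql)
    {
        return self::connection()->query($sql)->fetchAll(\PDO::FETCH_ASSOC);
    }
}

Any of the table objects above can then call SqlSingleton::Execute() without opening a second connection.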
I'm using Doctrine with Symfony in a couple of web app projects.
I've optimised many of the queries in these projects to select just the fields needed from the database. But over time new features have been added, and in a couple of cases additional fields are used in the code, causing the Doctrine lazy loader to re-query the database and driving the number of queries on some pages from 3 to 100+.
So I need to update the original query to include all of the required fields. However, there doesn't seem to be an easy way to have Doctrine log which field causes the additional query to be issued, so it becomes a painstaking job to sift through the code looking for usages of fields which aren't in the original query.
Is there a way to have Doctrine log when a getter accesses a field that hasn't been hydrated?
I have not had this issue myself, but I just looked at the Doctrine_Record class. Have you tried adding some debug output to the _get() method? I think this part is where you should look for a solution:
if (array_key_exists($fieldName, $this->_data)) {
    // check if the value is the Doctrine_Null object located in self::$_null
    if ($this->_data[$fieldName] === self::$_null && $load) {
        $this->load();
    }
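For example, a debug line could be dropped in right before the lazy load is triggered (just a sketch of the idea; error_log() is one way to surface it):

if ($this->_data[$fieldName] === self::$_null && $load) {
    // Hypothetical patch: report which field forced the extra query
    error_log(sprintf('Lazy load triggered by field "%s" on %s', $fieldName, get_class($this)));
    $this->load();
}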
Just turn on SQL logging and you can deduce the guilty field from the alias names. For how to do it in Doctrine 1.2, see this post.
Basically: create a class which extends Doctrine_EventListener:
class QueryDebuggerListener extends Doctrine_EventListener
{
    protected $queries;

    public function preStmtExecute(Doctrine_Event $event)
    {
        $query = $event->getQuery();
        $params = $event->getParams();
        // the below makes some naive assumptions about the queries being logged
        while (sizeof($params) > 0) {
            $param = array_shift($params);
            if (!is_numeric($param)) {
                $param = sprintf("'%s'", $param);
            }
            $query = substr_replace($query, $param, strpos($query, '?'), 1);
        }
        $this->queries[] = $query;
    }

    public function getQueries()
    {
        return $this->queries;
    }
}
And add the event listener:
$c = Doctrine_Manager::connection($conn);
$queryDbg = new QueryDebuggerListener();
$c->addListener($queryDbg);
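The collected queries can then be dumped at the end of the request, e.g.:

foreach ($queryDbg->getQueries() as $i => $query) {
    echo ($i + 1).': '.$query.PHP_EOL;
}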