Optimizing/improving data insert with Zend_Db_Table - php

On my ZF project, I'm importing data from a CSV file and, after some treatment, I insert the data into my MySQL database with a Zend_Db_Table. Here's what the code looks like:
private function addPerson($data)
{
    $personDao = new Person();
    $personRow = $personDao->createRow();
    if ($newperson == -1) // set by an earlier duplicate check (not shown)
    {
        // already in DB
    }
    else
    {
        $personRow->setName($data['name']);
        ...
        $personRow->save();
    }
}
It's working just fine. My only concern is the time it will take to insert thousands of rows this way.
So my question is: is there any way I can improve my code for large files?
Can I still use the save() function for a lot of rows (>6000)?
Any suggestion will be welcome.
I was wondering if there's a Zend function that can buffer, say, 500 rows and insert them in one shot instead of calling save() on each row. I'm already at 1 minute for 6,000 rows...

To optimize the import of the CSV file, I think you should hand the work over to MySQL itself, either through a stored procedure or from the PHP command line, and let MySQL load the file straight into your tables.
You will find ideas in Import CSV to mysql table.
I have not done it myself, but I think it is quite feasible.
I hope it helps :)
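If you would rather stay in PHP, another option is to buffer rows and issue one multi-row INSERT per batch through the table's adapter instead of calling save() on every row. Here is a minimal sketch, assuming Person extends Zend_Db_Table_Abstract; the table and column names (person, name, email) are placeholders for your real schema:
private function addPersons(array $rows)
{
    $personDao = new Person();
    $adapter   = $personDao->getAdapter();
    $batchSize = 500;
    $values    = array();

    foreach ($rows as $data) {
        // quote() escapes each value so it can be interpolated safely
        $values[] = '(' . $adapter->quote($data['name']) . ', '
                        . $adapter->quote($data['email']) . ')';

        if (count($values) >= $batchSize) {
            $adapter->query('INSERT INTO person (name, email) VALUES ' . implode(', ', $values));
            $values = array();
        }
    }
    if ($values) { // flush the last partial batch
        $adapter->query('INSERT INTO person (name, email) VALUES ' . implode(', ', $values));
    }
}
Even just wrapping your existing save() loop between $personDao->getAdapter()->beginTransaction() and commit() usually helps noticeably, because each save() no longer pays for its own implicit commit.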

Related

Restore db backup with php web application

Unfortunately I lost part of my database (some of my data, not the whole DB) a few days ago, but luckily I have daily backups. Today I found out that I have to restore 180 huge backups.
I need to compare each of the 180 backups with the current DB and insert the data that is no longer present in my DB.
At the beginning I wanted to use an application like Navicat or dbForge Studio, but that is not feasible because it would take a lot of time.
I also wanted to compare the SQL text files with each other, and that is not possible either.
Now I have to build a web application (PHP) to restore the databases, but I don't know how.
I'd appreciate it if anyone could help me.
Thank you.
If it is a large file, do not read it into memory all at once. Stream it.
$backup = fopen("yourBackUp.sql", "r");
// ...
fclose($backup);
If you have many files, use glob().
foreach (glob(dirname(__FILE__) . '/*.sql', GLOB_BRACE) as $sql) {
    $backup = fopen($sql, "r");
    // ...
    fclose($backup);
}
If your DB is set-up correctly, you should not be able to insert duplicate rows.
while (($line = fgets($backup)) !== false) {
    try {
        // unprepared() runs the raw SQL line as-is
        Illuminate\Support\Facades\DB::unprepared($line);
    } catch (Exception $e) {
        // duplicate row, skip it
    }
}
Without seeing what you have tried and where exactly it fails, we cannot help much more than this.

How to dump MySQL table to a file then read it and use it in place of the DB itself?

Because a provider I use has quite unreliable MySQL servers, which are down at least once per week :-/ impacting one of the sites I made, I want to work around the outages in the following way:
dump the MySQL table to a file, and in case the connection with the SQL
server fails,
read the file instead of the server, until the server is back.
This would hide the outages from the user experience point of view.
In fact things are not as easy as they seem, so I am asking for your help.
What I did first was save the data in JSON format.
But this has issues because a lot of the data in the DB is stored "in the clear", including escaped, complex URLs with long argument strings, and these cause problems during the decode from JSON.
CSV and TSV do not work correctly either.
CSV is delimited by commas or semicolons, and those characters are present in the original content taken from the DB.
The TSV format leaves double quotes that cannot be removed without going into each record's fields to strip them.
Then I tried to serialize each record read from the DB, store it, and retrieve it by unserializing it.
But the result is a bit catastrophic: all the records are stored in the file,
but when I retrieve them, only one is returned, and then something blocks the rest of the program (code below).
require_once('variables.php');
require_once("database.php");

$file = "database.dmp";
$myfile = fopen($file, "w") or die("Unable to open file!");
$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");

// output data of each row
while ($row = mysql_fetch_assoc($sql)) {
    // store the record into the file
    fwrite($myfile, serialize($row));
}
fclose($myfile);
mysql_close();

// Retrieving section
$myfile = fopen($file, "r") or die("Unable to open file!");

// Till the file is not ended, continue to check it
while ( !feof($myfile) ) {
    $record = fgets($myfile); // get the record
    $row = unserialize($record); // unserialize it
    print_r($row); // show if the variable has something on it
}
fclose($myfile);
I also tried uuencode and base64_encode, but they were worse choices.
Is there any way to achieve my goal?
Thank you very much in advance for your help
If you have your data layer well decoupled, you can consider using SQLite as a fallback storage.
It's just a matter of adding one more abstraction, with the same code accessing the storage and the storage target being switched when the primary one is unavailable.
-----EDIT-----
You could also think about some caching strategy (a JSON/HTML file?!) that returns stale data in case of a MySQL outage.
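A minimal sketch of that caching idea, using the same mysql_* functions as your script; the cache path, credentials and query are placeholders, and json_encode() assumes your rows are valid UTF-8:
function fetchSongs($cacheFile = '/tmp/songs.cache.json')
{
    $link = @mysql_connect('localhost', 'user', 'pass');
    if ($link && mysql_select_db('mydb', $link)) {
        $rows = array();
        $res  = mysql_query('SELECT * FROM song ORDER BY ID ASC', $link);
        while ($row = mysql_fetch_assoc($res)) {
            $rows[] = $row;
        }
        // refresh the cache while MySQL is reachable
        file_put_contents($cacheFile, json_encode($rows));
        return $rows;
    }
    // MySQL is down: serve the stale copy if we have one
    return file_exists($cacheFile) ? json_decode(file_get_contents($cacheFile), true) : array();
}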
-----EDIT 2-----
If it's not too much effort, please consider playing with PDO. I'm quite sure you'll never look back, and believe me, it will help you structure your DB calls with little pain when switching between storages.
Please take the following only as an example; there are much better ways to design this architectural part of the code.
Just a small, basic piece of code to demonstrate what I mean:
class StoragePersister
{
    private $driver = 'mysql';

    public function setDriver($driver)
    {
        $this->driver = $driver;
    }

    public function persist($data)
    {
        switch ($this->driver) {
            case 'mysql':
                $this->persistToMysql($data);
                break;
            case 'sqlite':
                $this->persistToSqlite($data);
                break;
        }
    }

    public function persistToMysql($data)
    {
        // query to MySQL
    }

    public function persistToSqlite($data)
    {
        // query to SQLite
    }
}

$storage = new StoragePersister;
$storage->setDriver('sqlite'); // eventually switch to sqlite
$storage->persist($somedata);  // calls the right method based on the selected storage driver
-----EDIT 3-----
Please have a look at the "strategy" design pattern; I think it can help you better understand what I mean.
After the SELECT you need to build a correct structure for inserting the data; then you can serialize it or whatever you want.
For example:
For each row you could do: $sqls[] = "INSERT INTO `song` (field1, field2, ... fieldN) VALUES (field1_value, field2_value, ... fieldN_value);";
Then you could serialize $sqls, write it into a file, and when you need it, read it back, unserialize it and run the queries.
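A rough sketch of that approach, reusing the mysql_* connection from the question; the `title` column and the dump file name are placeholders for your real fields:
// Dump: build one INSERT statement per row and serialize the whole list
$sqls = array();
$res  = mysql_query("SELECT * FROM song ORDER BY ID ASC");
while ($row = mysql_fetch_assoc($res)) {
    $sqls[] = sprintf(
        "INSERT INTO `song` (`ID`, `title`) VALUES ('%s', '%s')",
        mysql_real_escape_string($row['ID']),
        mysql_real_escape_string($row['title'])
    );
}
file_put_contents('song.dump.ser', serialize($sqls));

// Restore: read the file back and replay the statements one by one
$sqls = unserialize(file_get_contents('song.dump.ser'));
foreach ($sqls as $sql) {
    mysql_query($sql);
}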
Have you thought about caching your queries in a cache like APC? Also, you may want to use mysqli or PDO instead of mysql (the mysql extension is deprecated in the latest versions of PHP).
To answer your question, this is one way of doing it:
var_export() will export the variable as valid PHP code;
require will put the contents of the array into the $rows variable (because of the return statement).
Here is the code:
$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");
$content = array();

// collect each row, keyed by ID
while ($row = mysql_fetch_assoc($sql)) {
    $content[$row['ID']] = $row;
}
mysql_close();

// $file is the dump file from your script, e.g. "database.dmp"
$data = '<?php return ' . var_export($content, true) . ';';
file_put_contents($file, $data);

// Retrieving section
$rows = require $file;

PHP - chained triggers how to create/store?

I have been working with a small team on a small project about fictional world building. I have been assigned the task of managing triggers/chained behaviour of entities (rocks/places/items) that can be triggered in many ways, such as throwing a magic rock into the lake so that monster X appears, and the chain keeps triggering things until it reaches the end.
I have tried this
$Trigger_123 = new stdClass();
$Trigger_123->name = "Name";
$Trigger_123->renderOn = ? // (object_345->throwedInLakeX) ?
How can I store this in MySQL? Can I have it checked whether it is part of a chain? I also tried MySQL triggers, but I can't find a way to execute PHP from those triggers, for example running PHP code on UPDATE or DELETE.
Cron jobs were not an option, because many things will be added in the future and cron jobs would take a long time to finish; I was hoping to find a more PHP-based solution.
Edited (adding some additional information)
I have tried to implement this in many ways. I ended up with a system of dependencies, pretty much like Debian packages, which I believe is not well suited to this.
Database structure
Table "object"
--------------
ID (int)
Name (varchar)
Table "triggers"
----------------
ID (int)
Name (varchar)
Data (blob) // usually, I store php code and use eval to run
Table "attributes"
------------------
ID (int)
attribute (varchar)
value (blob)
Table "object_has_triggers"
---------------------------
ID (int)
ObjectID (int)
TriggerID (int)
Table "object_has_attributes"
-----------------------------
ID (int)
ObjectID (int)
AttributeID (int)
What I want as a result is for a PHP code snippet to execute each time:
a database transaction happens, both before it is committed to the database and after;
an object that has X triggers attached to it needs to resolve them;
a trigger fired by X needs to be checked to see whether all of its dependencies are satisfied.
Question:
Is something like this even possible to build with PHP, or should I try another scripting language like Python?
What I want as a result is for a PHP code snippet to execute each time
a database transaction happens, before it is committed to the database and after
You could call a PHP function from the trigger and write all of the logic that needs to run in PHP there.
An object that has X triggers attached to it needs to resolve them
For an object that has X triggers attached to it, either convert the triggers into PHP code or resolve them in PHP.
Each trigger fired by X needs to be checked to see whether all of its dependencies are satisfied
You can add one database table that stores a response after a trigger completes successfully. At the end you can then check whether all the dependencies are satisfied or not.
For calling a PHP function from a trigger, see the different answers to the following post for different types of solutions:
Invoking a PHP script from a MySQL trigger
A PHP code snippet that executes each time a database transaction happens, before it is committed to the database and after
Don't reinvent the wheel; this has an incredibly simple solution: put a layer on top of your database calls.
Instead of querying your database directly, call a function (perhaps a method on an object) that handles the database insertion of triggers. It is right there that you can add your code to pre- and post-process your triggers in whichever way you please.
function processDatabaseInsertion($trigger) {
    // Preceding code goes here
    // Database transaction goes here
    // Post-processing code goes here
}

$Trigger_123 = new stdClass();
$Trigger_123->name = "Name";
$Trigger_123->renderOn = $object_345->throwedInLakeX;
processDatabaseInsertion($Trigger_123);
Oversimplified, but you get the idea. I would recommend writing a custom class for your triggers, but I wrote it in a procedural style since I don't know whether you are familiar with OOP.
A PHP code snippet that executes each time an object that has X triggers attached to it needs to resolve them
Same principle as before. If you use PHP >= 5.3, you can spice it up a bit using closures:
function processDatabaseInsertion($trigger) {
    // Preceding code goes here
    // a closure stored in a property cannot be called as $trigger->renderOn(),
    // so invoke it through call_user_func()
    call_user_func($trigger->renderOn);
    // Database transaction goes here
    // Post-processing code goes here
}

$Trigger_123 = new stdClass();
$Trigger_123->name = "Name";
$Trigger_123->renderOn = function() use ($Trigger_123) { doAwesomeThing($Trigger_123); };
processDatabaseInsertion($Trigger_123);
Or otherwise go for a more traditional approach:
function processDatabaseInsertion($trigger) {
    // Preceding code goes here
    switch ($trigger->renderOn) {
        case "awesomeThing":
            doAwesomeThing($trigger);
            break;
        case "anotherThing":
            break;
        default:
            break;
    }
    // Database transaction goes here
    // Post-processing code goes here
}

$Trigger_123 = new stdClass();
$Trigger_123->name = "Name";
$Trigger_123->renderOn = "awesomeThing";
processDatabaseInsertion($Trigger_123);
Each trigger that is fired by X needs to be checked to see whether all of its
dependencies are satisfied
This can easily be handled by the methods above with some more PHP logic, as you will be able to tell. You may want, for instance, a generic function that is called every time you need to process a trigger, which in turn checks whether its dependencies are satisfied and runs the specific trigger if so; see the sketch below.
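A minimal sketch of such a generic function; the dependsOn property and the list of completed triggers are assumptions for illustration, not something from the question:
$completed = array(); // names of triggers that have already fired

function processTrigger($trigger, array &$completed)
{
    // only run once every dependency has fired
    foreach ($trigger->dependsOn as $dependency) {
        if (!in_array($dependency, $completed, true)) {
            return false; // not ready yet, check again on the next pass
        }
    }
    call_user_func($trigger->renderOn); // run the trigger's own behaviour
    $completed[] = $trigger->name;
    return true;
}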
Either way, there are better ways to tackle this problem than using eval() or some MySQL trigger hacking, as you can see :)

How to make a zipped download with files that exist in a database?

I want to make some kind of user panel for my users: after they update the info in the panel, it will create a new row for them in the database containing an original piece of source code that I built, but with the fields they edited filled in.
Example:
UserName: [_______]
PageID: [_______]
They fill it in, and when they press Update it will automatically insert the data into a pre-made piece of code in a new field in the table.
<?php
$username = ? (what's the best way to insert the UserName textarea value here?)
$pageid = ? (what's the best way to insert the PageID textarea value here?)
?>
Now, that was the first question: what's the best way to insert the UserName textarea value in here?
The second question is how to automatically encrypt this on insert (I don't care how it gets encrypted; even if it is not IonCube-encoded, that is fine).
And the last and most important question is how to make an automatic function so that when they press "Update", files are generated from the SQL field and they are prompted to download them as a zip (I don't want to store any of those files on my server, because they may interfere with one another when, say, 100 users are doing this at the same time).
Guys, trust me, I have been looking for these answers all over the net and didn't find a thing (I found EVERYTHING I need except this).
Thanks for future assistance, guys!
Best regards, Rico S.
1) The best way to do it is by using some sort of template formatting.
Put your template like this:
$template = "whats the best way to insert %%UserName%% %%textarea%% value in here.";
And then create an array like:
$trans = array("%%UserName%%" => $username, "%%textarea%%" => $textarea);
Then use PHP's strtr() function to convert it:
$data_to_store = strtr($template, $trans);
2) You can find a lot of encryption and decryption algorithms and PHP classes for doing that; check out PHP Classes.
3) You could try this, but I am not 100% sure whether it works properly.
Use PHP's ZipArchive class,
load the contents into a string,
and then:
<?php
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="downloaded.zip"');
$zip = new ZipArchive;
$res = $zip->open('php://stdout', ZipArchive::CREATE);
if ($res === TRUE) {
    $zip->addFromString('file.txt', $content_populated_from_db);
    $zip->close();
    // do not echo anything else here, or the downloaded archive will be corrupted
} else {
    echo 'failed';
}
exit;
?>
I hope this works. If it doesn't, try changing the flags of ZipArchive::open(), and if it still doesn't work, let me know with your code and I might be able to help. As of this point, I haven't tried it.
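If streaming straight to php://stdout does not work (ZipArchive generally needs a seekable file), a common alternative is to build the archive in a temporary file, send it and delete it immediately, so nothing lingers on the server. A sketch, with $content_populated_from_db standing in for the data pulled from the table:
$tmp = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive;
if ($zip->open($tmp, ZipArchive::OVERWRITE) === TRUE) {
    $zip->addFromString('file.txt', $content_populated_from_db);
    $zip->close();

    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="download.zip"');
    header('Content-Length: ' . filesize($tmp));
    readfile($tmp); // stream the archive to the browser
}
unlink($tmp); // remove the temporary file right away
exit;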

PHP + Mysql queries for a real Beginner

After years of false starts, I'm finally diving head first into learning to code PHP. After about 10 failed previous attempts to learn, it's getting exciting and finally going fairly well.
The project I'm using to learn with is for work. I'm trying to import 100+ fixed-width text files into a MySQL database.
So far so good
I'm getting comfortable with sql, and I'm learning some php tricks, but I'm not sure how to tie all the pieces together. The basic structure for what I want to do goes something like the following:
Name the text file I want to import
Do a LOAD DATA INFILE to import the data into a single column in a temporary table
Use substring() to separate the fixed width file into real columns
Remove lines I don't want (file identifiers, subtotals, etc....)
Append the data from the temp table to the main table
Drop the temp tables and start again
As you can see in the attached code, things are working fine. It gets the new file, imports it into the temp table, removes unwanted lines and then moves the content to its final home in the main table. Perfect.
Questions three
My questions are:
Am I doing this 'properly'? When I want to run a pile of queries one after another, do I just keep assigning mysql_query() to random variables?
How would I go about automating the script to loop through every file there and import them, rather than having to change the file name and run the script by hand every time?
And, last, what PHP function would I use to 'select' the file(s) I want to import? You know, like attaching a file to an email: browse for the file, upload it, and then run the script on it.
Sorry for this being an ultra-beginner question, but I'm having trouble seeing how all the pieces fit together. Specifically, I'm wondering how multiple SQL queries get strung together to form a script. The way I've done it below? Some other way?
Thanks x 100 for any insights!
Terry
<?php
// 1. Create db connection
$connection = mysql_connect("localhost","root","root") or die("DB connection failed:" . mysql_error());
// 2. Select the database
$db_select = mysql_select_db("pd",$connection) or die("Couldn't select the database:" . mysql_error());
?>
<?php
// 3. Perform db query
// Drop table import if it already exists
$q="DROP table IF EXISTS import";
//4. Make new import table with just one field
if ($newtable = mysql_query("CREATE TABLE import (main VARCHAR(700));", $connection)) {
echo "Table import made successfully" . "<br>";
} else{
echo "Table import was not made" . "<br>";
}
//5. LOAD DATA INFILE
$load_data = mysql_query("LOAD DATA INFILE '/users/terrysutton/Desktop/importmeMay2010.txt' INTO table import;", $connection) or die("Load data failed" . mysql_error());
//6. Cleanup unwanted lines
if ($cleanup = mysql_query("DELETE FROM import WHERE main LIKE '%GRAND%' OR main LIKE '%Subt%' OR main LIKE '%USER%' OR main LIKE '%DATE%' OR main LIKE '%FOR:%' OR main LIKE '%LOCATION%' OR main LIKE '%---%' OR `main` = '';")){
echo "Table import successfully cleaned up";
} else{
echo "Table import was not successfully cleaned up" . "<br>";
}
// 7. Next, make a table called "temp" to store the data before it gets imported to denominators
$temptable = mysql_query("CREATE TABLE temp
SELECT
SUBSTR(main,1,10) AS 'Unit',
SUBSTR(main,12,18) AS 'Description',
SUBSTR(main,31,5) AS 'BD Days',
SUBSTR(main,39,4) AS 'ADM',
SUBSTR(main,45,4) AS 'DIS',
SUBSTR(main,51,4) AS 'EXP',
SUBSTR(main,56,5) AS 'PD',
SUBSTR(main,100,5) AS 'YTDADM',
SUBSTR(main,106,5) AS 'YTDDIS',
SUBSTR(main,113,4) AS 'YTDEXP',
SUBSTR(main,118,5) AS 'YTDPD'
FROM import;");
// 8. Add a column for the date
$datecolumn = mysql_query("ALTER TABLE temp ADD Date VARCHAR(20) AFTER Unit;");
$date = mysql_query("UPDATE temp SET Date='APR 2010';");
// 9. Move data from the temp table to its final home in the main database
// Append data in temp table to denominator table
$append = mysql_query("INSERT INTO denominators SELECT * FROM temp;");
// 10. Drop import and temp tables to start from scratch.
$droptables = mysql_query("DROP TABLE import, temp;");
// 11. Next, rename the text file to be imported and do the whole thing over again.
?>
<?php
// 12. Close connection
mysql_close($connection);
?>
If you have access to the command line, you can do all your data loading right from the mysql command line. Further, you can automate the process by writing a shell script. Just because you can do something in PHP doesn't mean you should.
For instance, you can just install phpMyAdmin, create your tables on the fly, then use mysqldump to dump your database definitions to a file, like so:
mysqldump -u myusername -pmypassword mydatabase > mydatabase.backup.sql
later, you can then just reload the whole database
mysql -u myusername -pmypassword < mydatabase.backup.sql
It's cool that you are learning to do things in PHP, but focus on the stuff you will do in PHP regularly rather than doing RDBMS work in PHP, which is not where you should do it most of the time anyway. Build forms, and process the data. Learn how to build objects, and why you might want to do that. Head over and check out Symfony and Doctrine. Learn about the Front Controller pattern.
Also, look into PDO. It is considered very bad form to use the direct mysql_query() functions any more.
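For reference, a minimal PDO version of the kind of insert your script ends with; the credentials match the ones in your code, and the column list is shortened for illustration:
$pdo = new PDO('mysql:host=localhost;dbname=pd', 'root', 'root');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// prepared statements keep the values separate from the SQL
$stmt = $pdo->prepare('INSERT INTO denominators (Unit, Date) VALUES (?, ?)');
$stmt->execute(array('SOME UNIT', 'APR 2010'));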
Finally, PHP is great for templating and for including disparate parts to form a cohesive whole. Practice making a left and top navigation HTML file. Figure out how you can include that one file on all your pages so that the same navigation shows up everywhere.
Then figure out how to look at variables like the page name and highlight the navigation tab you are on, as in the sketch below. Those are the things PHP is well suited for.
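A tiny sketch of that include-and-highlight idea; the file names and the $currentPage variable are just placeholders:
// nav.php -- shared navigation, included on every page
$pages = array('home' => 'index.php', 'reports' => 'reports.php');
echo '<ul>';
foreach ($pages as $name => $url) {
    // highlight the tab for the page that included us
    $class = ($name === $currentPage) ? ' class="active"' : '';
    echo "<li{$class}><a href=\"{$url}\">{$name}</a></li>";
}
echo '</ul>';

// index.php -- each page declares itself, then pulls the nav in
$currentPage = 'home';
include 'nav.php';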
Why don't you load the files and process them in PHP, and use PHP to insert the values into the actual table?
I.e.:
$data = file_get_contents('somefile');

// process the data here, say you dump it into a 2D array like
// $insert[$rows][$cols]

// then you can insert these into the db; note that mysql_query()
// only runs one statement per call, so query inside the loop:
foreach ($insert as $row) {
    mysql_query("INSERT INTO table VALUES ('{$row[0]}', '{$row[1]}', '{$row[2]}')");
}
The purpose of assigning mysql_query() to a variable is so that you can get at the data you were querying for. For any query other than a SELECT, it only returns true or false.
So in the case where you are using if ($var = mysql_...), you do not need the variable assignment there at all, as the function simply returns true or false.
Also, I feel that all your substring and data-file processing would be much better done in PHP; you can look into the fopen() function and the related functions on the left side of that page.
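A sketch of what that PHP-side processing could look like: read the fixed-width file line by line, skip the subtotal/header lines, and slice the columns with substr() (the offsets below are your 1-based SUBSTR() positions shifted to PHP's 0-based ones; only the first few columns are shown):
$insert = array();
$handle = fopen('/users/terrysutton/Desktop/importmeMay2010.txt', 'r');

while (($line = fgets($handle)) !== false) {
    // skip subtotal, header and separator lines
    if (trim($line) === '' || preg_match('/GRAND|Subt|USER|DATE|FOR:|LOCATION|---/', $line)) {
        continue;
    }
    $insert[] = array(
        'Unit'        => trim(substr($line, 0, 10)),
        'Description' => trim(substr($line, 11, 18)),
        'BD Days'     => trim(substr($line, 30, 5)),
        // ... the remaining columns follow the same pattern
    );
}
fclose($handle);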
