First of all, English is not my first language, so sorry if I make some mistakes. I am fairly new to PHP and MySQL, and I am working on a small personal project. I am stuck on something that I really don't know how to do, and I would like your opinions on the best way to do it.
I want to generate a file (using fwrite) that will contain this information:
class applications
{
    private $_"value1";
    private $_"value2";
    private $_"value3";
    // etc., depending on how many columns the table has
}
Each "value" is the name of one column of a table.
To get the column names I am using INFORMATION_SCHEMA.COLUMNS in this query:
$columns = $db->prepare("SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE table_name = '$_POST[dropdown]' AND table_schema = 'tp2'");
$columns->execute();
$resultCol = $columns->fetchAll();
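A safer variant of that lookup (just a sketch, assuming $db is a PDO connection, which the prepare()/fetchAll() calls suggest) would bind the user-supplied value instead of interpolating $_POST into the SQL:
$columns = $db->prepare(
    "SELECT COLUMN_NAME
       FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = :tbl AND TABLE_SCHEMA = 'tp2'"
);
// bind $_POST['dropdown'] as a parameter rather than embedding it in the query
$columns->execute(array(':tbl' => $_POST['dropdown']));
$resultCol = $columns->fetchAll();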
To write the file I am using this code:
$ouv = fopen($dir."/".$nomfile.".class.".$ext, 'w+');
fwrite($ouv, $text);
The variable $text must contain the result of a foreach loop that looks something like this:
foreach ($resultCol as $value3)
{
    print_r("private " . "$" . "_" . $value3[0] . ";" . "<br/>");
}
This print_r would work if I wanted to display it on a PHP page, but I need to save the output as a string in a file.
Since $text is a variable, I can't put a foreach inside it, and I really don't know how to do this.
So in your opinion, what would be the best/easiest way to do what I am trying to do?
Steps:
Open the file
Loop through the result set
Read the row's column values
Construct the text
Write it into the file
Close the file
Example:
$ouv = fopen( $dir . "/" . $nomfile . ".class." . $ext, 'w+' );
foreach( $resultCol as $value3 )
{
// single-quote the prefix so PHP does not try to interpolate "$_"
$text = 'private $_' . $value3[0] . ";\n";
fwrite( $ouv, $text );
}
fclose( $ouv );
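If you also want the surrounding class declaration from the question, a slightly fuller sketch (untested; it simply reuses $nomfile as the class name, which is an assumption) could look like this:
$ouv = fopen($dir . "/" . $nomfile . ".class." . $ext, 'w+');
fwrite($ouv, "<?php\n\nclass " . $nomfile . "\n{\n");
foreach ($resultCol as $value3) {
    // single quotes keep the literal "$_" prefix from being interpolated
    fwrite($ouv, '    private $_' . $value3[0] . ";\n");
}
fwrite($ouv, "}\n");
fclose($ouv);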
I have recently created a plugin for WordPress. First of all, I would like to explain the background of what the plugin does. The scenario is this: a CSV file sits at some path, and the plugin imports that CSV file into a database table once a day. I have also created a shortcode where the data is fetched from the previously imported table. The function is:
function shortcode_mysreView(){
    global $wpdb; // $wpdb is not available inside the function without this
    $xcode = "";
    // $table_name and $col_name are assumed to be defined elsewhere
    $column = $wpdb->get_results($wpdb->prepare(
        "SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = %s AND TABLE_NAME = %s AND COLUMN_NAME = %s",
        'wordpress',
        $table_name,
        $col_name
    ));
    $finder = "SELECT $col_name FROM " . $table_name . " WHERE code = '" . $xcode . "'";
    $output = $wpdb->get_results($finder);
    $myresult = $output[0]->$col_name;
    return check_output($myresult);
}
The shortcode registration:
add_shortcode('mysre', 'shortcode_mysreView');
The shortcode as used:
[mysre name="address"]
Now the problem is that every time I use the shortcode somewhere, the DB query is run. I think that's not good practice. Please suggest a way to do this so that the query does not have to run every time the shortcode is used.
My thoughts:
What if I fetch the data once a day when the import is performed and store it as a txt file?
Is there anything like a session or something similar?
Please suggest something.
Run a cron job that performs the query once a day, and save the result in wp_options. Then call get_option() when your shortcode is used, as in the sketch below.
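A rough sketch of that idea (the hook name, option name and cached query are placeholders, so adjust them to your import code); the shortcode callback then only reads the cached option instead of querying on every render:
// when the daily import runs, cache the value the shortcode needs
add_action('mysre_daily_import', 'mysre_refresh_cache');
function mysre_refresh_cache() {
    global $wpdb;
    // placeholder query: whatever the shortcode would otherwise look up
    $value = $wpdb->get_var("SELECT address FROM my_imported_table LIMIT 1");
    update_option('mysre_cached_value', $value);
}

// schedule the hook once a day if it is not scheduled yet
if (!wp_next_scheduled('mysre_daily_import')) {
    wp_schedule_event(time(), 'daily', 'mysre_daily_import');
}

// the shortcode now only reads the cached option, no query per page view
function shortcode_mysreView() {
    return get_option('mysre_cached_value', '');
}
add_shortcode('mysre', 'shortcode_mysreView');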
I have this function, which automatically deletes textfiles older than a certain age from my database.
$r = new textfiles;
$db = new DB;
$currTime_ori = $db->queryOneRow("SELECT NOW() as now");
...
if($this->site->textfilesretentiondays != 0)
{
echo "PostPrc : Deleting textfiles older than ".$this->site->textfilesretentiondays." days\n";
$result = $db->query(sprintf("select ID from textfiles where postdate < %s - interval %d day", $db->escapeString($currTime_ori["now"]), $this->site->textfilesretentiondays));
foreach ($result as $row)
$r->delete($row["ID"]);
}
Now I would like to edit this function so that all textfiles are first automatically saved into the directory /www/backup, and only then deleted with $r->delete($row["ID"]);
At the moment I have no idea how I could implement this.
It's hard to give you a complete answer to your question because of the lack of information.
Do you store the whole file content in the database, or only the path and filename?
It would help to see the contents of "$row", which represents one row from the database.
If you just store the filename (and optionally the path), you could use PHP's copy() function (http://php.net/manual/de/function.copy.php) to copy the file to your backup directory. Please note that you have to ensure that the user executing the script or running the web server has the privileges to write into that directory.
You could add this functionality to the textfiles class as a method like makeBackup; a sketch follows below.
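A minimal sketch of such a method, assuming the row has a hypothetical column called path holding the full filename:
// inside class textfiles: copy one file into the backup directory
// before it gets deleted; assumes a hypothetical 'path' column
public function makeBackup(array $row, $backupDir = '/www/backup')
{
    $source = $row['path'];
    $target = rtrim($backupDir, '/') . '/' . basename($source);

    if (!is_file($source)) {
        return false; // nothing to back up
    }
    return copy($source, $target);
}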
There is not much information to go on, but I'll give it a try. If you want to back up the rows before deleting them, you can store them in a .txt file in json-encoded form using this piece of code, inserted in the foreach loop before the delete command:
$myfile = fopen("/www/backup/".$row["ID"].".txt", "w") or die("Unable to open file!");
$txt = json_encode($row);
fwrite($myfile, $txt);
fclose($myfile);
Following your approach:
function delete($id) {
    global $db;
    $result = $db->query(sprintf("SELECT * FROM textfiles WHERE id = %d", $id));
    $row = $result[0]; // assuming query() returns the matching rows as an array

    // if you have the file path in the db, use copy() as SebTM suggested
    $path = $row['path']; // assuming 'path' is the column name in your db
    $filename = basename($path); // to get the filename
    $backup_location = '/www/backup/' . $filename;
    copy($path, $backup_location);

    // if you have the data itself in the db
    $content = $row['data']; // assuming the data to back up is in a field 'data'
    $backup_location = '/www/backup/file.txt';
    file_put_contents($backup_location, $content);
}
But this is not the most optimal approach: you could move even the initial query into the delete function above and call that function only once, instead of calling it in a loop (see the sketch below).
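A rough sketch of that single-pass variant (again assuming a hypothetical path column, and the same $db object and retention setting used in the original function):
$days = $this->site->textfilesretentiondays;
$now  = $db->escapeString($currTime_ori["now"]);

// grab the old rows once, back each file up, then delete them in one statement
$old = $db->query(sprintf(
    "SELECT ID, path FROM textfiles WHERE postdate < %s - INTERVAL %d DAY",
    $now, $days
));
foreach ($old as $row) {
    copy($row['path'], '/www/backup/' . basename($row['path']));
}
$db->query(sprintf(
    "DELETE FROM textfiles WHERE postdate < %s - INTERVAL %d DAY",
    $now, $days
));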
I have a question on how to go about the next phase of a project I am working on.
Phase I:
Create a PHP script that scrapes a directory for all .txt files.
Open/parse each line, exploding it into an array.
Loop through the array, picking out the pieces of data that are needed and INSERTING everything into the database (120+ .txt files & 100k records inserted).
This leads me to my next step,
Phase II:
I need to take a 'list' of several tens of thousands of numbers.
Loop through each one, using that piece of data (number) as the search term to QUERY the database; if a match is found, I need to grab a piece of data in a different column of the same record/row.
General thoughts/steps I plan to take:
Scrape the directory to find the 'source' text file.
Open/parse the 'source file' line by line.
Explode each line on its delimiting character and grab the 'target search number'.
Dump each number into a 'master list' array.
Loop through my 'master list' array, using each number in my search (SELECT) statement.
If a match is found, grab a piece of data in another column of the matching/returned row (record).
Output this data, either to screen or to a .txt file (haven't decided on that step yet; most likely a text file with each returned number on a new line).
Specifics:
I am not sure how to go about doing a 'multiple' search/select statement like this?
How can I do multiple SELECT statements, each with a unique search term, and also collect the returned column data?
Is the DB fast enough to return the matching value/data in a loop like this? Do I need to wait/pause/delay somehow for the returned data before iterating through the loop again?
thanks!
Current function I am using/trying:
This is where I am at currently:
$harNumArray2 = implode(',', $harNumArray);
//$harNumArray2 = '"' . implode('","', $harNumArray) . '"';
$query = "SELECT guar_nu FROM placements WHERE har_id IN ($harNumArray2)";
echo $query;
$match = mysql_query($query);
//$match = mysql_query('"' . $query . '"');
$results = $match;
echo("<BR><BR>");
print_r($results);
I get these outputs respectively:
Array ( [0] => sample_source.txt )
Total FILES TO GRAB HAR ID's FROM: 1
TOAL HARS FOUND IN ALL FILES: 5
SELECT guar_nu FROM placements WHERE har_id IN ("108383442","106620416","109570835","109700427","100022236")
&
Array ( [0] => sample_source.txt )
Total FILES TO GRAB HAR ID's FROM: 1
TOAL HARS FOUND IN ALL FILES: 5
SELECT guar_nu FROM placements WHERE har_id IN (108383442,106620416,109570835,109700427,100022236)
Where do I stick this to actually execute it now?
thanks!
Update:
This code seems to be working 'ok', but I don't understand how to handle the returned data correctly. I seem to be outputting (printing) only the last variable/row's data instead of the entire list.
$harNumArray2 = implode(',', $harNumArray);
//$harNumArray2 = '"' . implode('","', $harNumArray) . '"';
//$query = "'SELECT guar_num FROM placements WHERE har_id IN ($harNumArray2)'";
$result = mysql_query("SELECT har_id, guar_num FROM placements WHERE har_id IN (" . $harNumArray2 . ")")
//$result = mysql_query("SELECT har_id, guar_num FROM placements WHERE har_id IN (0108383442,0106620416)")
or die(mysql_error());
// store the record of the "example" table into $row
$row = mysql_fetch_array($result);
$numRows = mysql_num_rows($result);
/*
while($row = #mysql_fetch_assoc($result) ){
// do something
echo("something <BR>");
}
*/
// Print out the contents of the entry
echo("TOTAL ROWS RETURNED : " . $numRows . "<BR>");
echo "HAR ID: ".$row['har_id'];
echo " GUAR ID: ".$row['guar_num'];
How do I handle this returned data properly?
thanks!
I don't know if this answers your question, but I think you're asking about sub-queries. They're pretty straightforward and look something like this:
SELECT * FROM tbl1 WHERE id = (SELECT num FROM tbl2 WHERE id = 1);
That will only work if the subquery returns a single value; if it returns multiple rows, the query will fail with an error. If you have to select multiple rows, research JOIN statements. This can get you started:
http://www.w3schools.com/sql/sql_join.asp
I am not sure how to go about doing a 'multiple' search/select statement like this?
With regard to a multiple select (and I'll assume that you're using MySQL), you can do that simply with the IN keyword:
for example:
SELECT *
FROM YOUR_TABLE
WHERE COLUMN_NAME IN (LIST, OF, SEARCH, VALUES, SEPARATED, BY COMMAS)
EDIT: following your updated code in the question.
just a point before we go on... you should try to avoid the mysql_ functions in PHP for new code, as they are about to be deprecated. Think about using the generic PHP DB handler PDO or the newer mysqli_ functions. More help on choosing the "right" API for you is here.
How do I handle this returned data properly?
For handling more than one row of data (which you are), you should use a loop. Something like the following should do it (and my example will use the mysqli_ functions - which are probably a little more similar to the API you've been using):
$mysqli = mysqli_connect("localhost", "user", "pass");
mysqli_select_db($mysqli, "YOUR_DB");
// make a comma separated list of the $ids.
$ids = join(", ", $id_list);
// note: you need to pass the db connection to many of these methods with the mysqli_ API
$results = mysqli_query($mysqli, "SELECT har_id, guar_num FROM placements WHERE har_id IN ($ids)");
$num_rows = mysqli_num_rows($results);
while ($row = mysqli_fetch_assoc($results)) {
echo "HAR_ID: ". $row["har_id"]. "\tGUAR_NUM: " . $row["guar_num"] . "\n";
}
Please be aware that this is very basic (and untested!) code, just to show the bare minimum of the steps. :)
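And since the mysql_ functions are on the way out (see the note above), here is a hedged sketch of the same IN() query done with PDO and bound placeholders; the connection details are placeholders:
$pdo = new PDO("mysql:host=localhost;dbname=YOUR_DB", "user", "pass");

// one "?" placeholder per id, so the values are bound rather than concatenated
$placeholders = implode(", ", array_fill(0, count($id_list), "?"));
$stmt = $pdo->prepare(
    "SELECT har_id, guar_num FROM placements WHERE har_id IN ($placeholders)"
);
$stmt->execute(array_values($id_list));

foreach ($stmt as $row) {
    echo "HAR_ID: " . $row["har_id"] . "\tGUAR_NUM: " . $row["guar_num"] . "\n";
}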
I have been trying to get the complete meta information for the fields in a result set from PostgreSQL in PHP (something like mysql_fetch_field(), which gives a lot of info about the field definition). While I am able to use the following functions to find some information:
$name = pg_field_name($result, 1);
$table = pg_field_table($result, 1);
$type = pg_field_type($result, 1);
I could not find a way to get more details about whether the field allows null values, contains blob data (by field definition), is a primary/unique key by definition, etc. mysql_fetch_field() gives all of this information somehow, which is very useful.
I would really like some way to get that information from php directly, but if that is not possible, then maybe someone has created a routine that might be able to extract that info from a pgsql resultset somehow.
PS: This looks promising, but the warning on the page is not a good sign:
http://php.net/manual/en/pdostatement.getcolumnmeta.php
Also, I am not using PDO at the moment, but if there is no solution, then a PDO-specific answer will suffice too.
You can find the metadata on your column with a query like this:
select * from information_schema.columns
where table_name = 'regions' and column_name = 'capital'
This should provide all of the information found in mysql_fetch_field. I'm not a php coder, so perhaps someone knows of a function that wraps this query.
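If it helps, here is an untested sketch of wrapping that query with PHP's pg_* functions, combining it with the field and table names a result set already exposes ($conn is assumed to be an open pg_connect() handle):
$result = pg_query($conn, "SELECT * FROM regions");

for ($i = 0; $i < pg_num_fields($result); $i++) {
    $name  = pg_field_name($result, $i);
    $table = pg_field_table($result, $i);

    // pull nullability, data type, default, etc. from information_schema
    $meta = pg_query_params(
        $conn,
        "SELECT is_nullable, data_type, column_default
           FROM information_schema.columns
          WHERE table_name = $1 AND column_name = $2",
        array($table, $name)
    );
    print_r(pg_fetch_assoc($meta));
}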
All;
I was amazed to find pgsql does not have a column count routine in PHP. The help posts I looked up all got the total count from "information_schema.columns", which is not what you want when doing cell-by-cell processing.
So here are a couple of quick functions to use:
// Test cell by cell
echo "Testing Cell by Cell! <br>";
// list the columns explicitly so pg_numcols() below can count them
$sql = "SELECT col1, col2, col3 FROM sometable";
$res = pg_query($sql);
$r_cnt = pg_numrows($res) or die("Error: Row Count Failed!");
$c_cnt = pg_numcols($sql) or die("Error: Col Count Failed!");
echo "C=> $c_cnt R=> $r_cnt <br>";
for ($n = 0; $n < $r_cnt; $n++) {
    for ($x = 0; $x < $c_cnt; $x++) {
        // pg_result() row and field offsets are zero-based
        echo "Cell=> " . pg_result($res, $n, $x) . " <br>";
    } // end for $x
} // end for $n
// crude column count: parses the SELECT list of the query string itself,
// so it only works when the columns are listed explicitly (not SELECT *)
function pg_numcols($sql) {
    $spos  = strpos(strtoupper($sql), 'SELECT') + 6;
    $fpos  = strpos(strtoupper($sql), 'FROM');
    $lenp  = $fpos - $spos;
    $t_str = substr($sql, $spos, $lenp);
    $x_str = explode(',', trim($t_str));
    return count($x_str);
} // end function
Hope you enjoy!
Automatically build a MySQL table upon a CSV file upload.
I have an admin section where an admin can upload CSV files with different column counts and different column names.
It should then build a MySQL table in the DB, reading the first line to create the columns and then importing the data accordingly.
I am aware of a similar issue, but this is different because of the following specs.
The name of the table should be the name of the file (minus the .csv extension)
Each CSV file can be different
It should build a table with the number of columns and the names taken from the CSV file
And add the data from the second line onwards
Here is a design sketch
Maybe there are known frameworks that make this easy.
Thanks.
$file = 'filename.csv';
$table = 'table_name';
// get structure from csv and insert db
ini_set('auto_detect_line_endings',TRUE);
$handle = fopen($file,'r');
// first row, structure
if ( ($data = fgetcsv($handle) ) === FALSE ) {
echo "Cannot read from csv $file";die();
}
$fields = array();
$field_count = 0;
for($i=0;$i<count($data); $i++) {
$f = strtolower(trim($data[$i]));
if ($f) {
// normalize the field name, strip to 20 chars if too long
$f = substr(preg_replace ('/[^0-9a-z]/', '_', $f), 0, 20);
$field_count++;
$fields[] = $f.' VARCHAR(50)';
}
}
$sql = "CREATE TABLE $table (" . implode(', ', $fields) . ')';
echo $sql . "<br /><br />";
// $db->query($sql);
while ( ($data = fgetcsv($handle) ) !== FALSE ) {
$fields = array();
for($i=0;$i<$field_count; $i++) {
$fields[] = '\''.addslashes($data[$i]).'\'';
}
$sql = "Insert into $table values(" . implode(', ', $fields) . ')';
echo $sql;
// $db->query($sql);
}
fclose($handle);
ini_set('auto_detect_line_endings',FALSE);
Maybe the fgetcsv() function will help you:
fgetcsv (PHP 4, PHP 5): gets a line from a file pointer and parses it for CSV fields
http://php.net/manual/en/function.fgetcsv.php
http://bytes.com/topic/mysql/answers/746696-create-mysql-table-field-headings-line-csv-file has a good example of how to do this.
The second example should put you on the right track; there isn't an automatic way to do it, so you're going to need to do a little programming, but it shouldn't be too hard once you use that code as a starting point.
Building a table is a query like any other, and theoretically you could get the names of your columns from the first row of a CSV file.
However, there are some practical problems:
How would you know what data type a certain column is?
How would you know what the indexes are?
How would you get data out of the table / how would you know what column represents what?
As you can't relate your new table to anything else, you are kind of defeating the purpose of a relational database, so you might as well just keep and use the CSV file.
What you are describing sounds like an ETL tool. Perhaps Google for MySQL ETL tools... You are going to have to decide what OS and style you want.
Or just write your own...