I have this code (thanks to the users of Stack Overflow, I got the markup I needed :) ). However, I have now hit a point I know nothing about: I need to output this formatted table of query results to a text file on the server.
<?php
// Make a MySQL Connection
mysql_connect("hostname.net", "user", "pass") or die(mysql_error());
mysql_select_db("database") or die(mysql_error());
// Get all the data from the "example" table
$result = mysql_query("SELECT * FROM cards ORDER BY card_id")
or die(mysql_error());
echo "<table border='1'>";
echo "<tr> <th>Name</th> <th>Title</th> <th>Bar</th> </tr>";
// keeps getting the next row until there are no more to get
while($row = mysql_fetch_array( $result )) {
    // Print out the contents of each row into a table
    echo "<tr><td>";
    echo $row['card_id'];
    echo "</td><td>";
    echo $row['title'];
    echo "</td><td>";
    echo $row['item_bar'];
    echo "</td></tr>";
}
echo "</table>";
?>
I know I could use something similar to
<?php
$myFile = "test.txt";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = "Bobby Bopper\n";
fwrite($fh, $stringData);
fclose($fh);
?>
but I am sure that it can't be the best solution. So I guess my question is: does anyone know how to achieve this?
The nicest solution, particularly if you are short on memory, would be to put the writing into the loop:
$fh = fopen('cards.csv', 'w');
// keeps getting the next row until there are no more to get
while($row = mysql_fetch_array( $result )) {
fputcsv($fh, array($row['card_id'], $row['title'], $row['item_bar']), "\t");
}
fclose($fh);
Note that I have used fputcsv to output the data in CSV format (using a tab as the delimiter). This should be easy to read by hand, and will also be easily understood by, for instance, a spreadsheet program. If you preferred a custom format, you should use fwrite as in your question.
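If you later need to read that file back in PHP, fgetcsv accepts the same delimiter. A minimal sketch (assuming the three-column layout written above):
$fh = fopen('cards.csv', 'r') or die("can't open file");
while (($fields = fgetcsv($fh, 0, "\t")) !== false) {
    // column order matches the fputcsv call: card_id, title, item_bar
    echo $fields[0] . ': ' . $fields[1] . ' (' . $fields[2] . ")\n";
}
fclose($fh);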
Have a look at:
http://dev.mysql.com/doc/refman/5.0/en/select.html
especially:
[INTO OUTFILE 'file_name' export_options
| INTO DUMPFILE 'file_name'
| INTO var_name [, var_name]]
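A rough sketch of how that could be issued from the same PHP code (note: INTO OUTFILE writes the file on the MySQL server host, the MySQL user needs the FILE privilege, and the target file must not already exist; the path here is just a placeholder):
$sql = "SELECT card_id, title, item_bar FROM cards ORDER BY card_id
        INTO OUTFILE '/tmp/cards.txt'
        FIELDS TERMINATED BY '\\t'
        LINES TERMINATED BY '\\n'";
mysql_query($sql) or die(mysql_error());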
Something like this?
// Make sure the file exists, do some checks.
// Loop through the result set and append each row to the file.
while (false !== ($aRow = mysql_fetch_assoc($rResult))) {
$sCreateString = $aRow['field1'].';'.$aRow['field2'];
file_put_contents('example.txt', $sCreateString, FILE_APPEND);
}
// Done
If you need an exact dump of the database table, there are better options (much better, actually), such as mysqldump or SELECT ... INTO OUTFILE.
If you want to write a text file, then there's nothing wrong with what you've suggested.
There's lots of information in the PHP manual: http://www.php.net/manual/en/ref.filesystem.php
It depends entirely on how you want to store the data inside the file. You can create your own flat-file database format if you wish, and extract the data using PHP once you've read in the file.
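For example, a minimal sketch of a home-grown flat file, one record per line with tab-separated fields (the file name and sample data are just placeholders):
// Writing: one record per line, fields separated by tabs.
$records = array(
    array(1, 'First card', 'BAR001'),
    array(2, 'Second card', 'BAR002'),
);
$fh = fopen('cards.txt', 'w');
foreach ($records as $record) {
    fwrite($fh, implode("\t", $record) . "\n");
}
fclose($fh);

// Reading it back and splitting each line into fields again.
foreach (file('cards.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($id, $title, $bar) = explode("\t", $line);
    echo "$id: $title ($bar)\n";
}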
I have this function, which automatically deletes textfiles older than a certain age from my database.
$r = new textfiles;
$db = new DB;
$currTime_ori = $db->queryOneRow("SELECT NOW() as now");
...
if($this->site->textfilesretentiondays != 0)
{
echo "PostPrc : Deleting textfiles older than ".$this->site->textfilesretentiondays." days\n";
$result = $db->query(sprintf("select ID from textfiles where postdate < %s - interval %d day", $db->escapeString($currTime_ori["now"]), $this->site->textfilesretentiondays));
foreach ($result as $row)
$r->delete($row["ID"]);
}
Now I would like to edit this function so that all textfiles are first saved into the directory /www/backup, and only then deleted by the script via $r->delete($row["ID"]);.
At the moment I have no idea how to implement this.
It seems impossible to give you a complete answer to your question because some information is missing.
Do you store the whole file content in the database, or only the path and filename?
It would help to see the content of "$row", which represents one row from the database.
If you just store the filename (and optionally the path), you could use PHP's copy function (http://php.net/manual/de/function.copy.php) to copy the file to your backup directory. Please note that you have to make sure the user executing the script, or running the web server, has the privileges to write into that directory.
You could add this functionality to the textfiles class as a method, e.g. makeBackup, as sketched below.
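A minimal sketch of what such a method could look like, assuming the table stores a file path in a column called path (both the column name and the /www/backup directory are assumptions):
// Inside class textfiles:
public function makeBackup($row, $backupDir = '/www/backup/')
{
    // 'path' is an assumed column name - adjust it to your schema
    if (!isset($row['path']) || !is_readable($row['path'])) {
        return false; // nothing to copy, or file not accessible
    }
    // copy() leaves the original in place; deletion stays in delete()
    return copy($row['path'], $backupDir . basename($row['path']));
}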
There isn't much information to go on, but I'll give it a try. If you want to back up the rows before deleting them, you can store each one in a .txt file in JSON-encoded form using this piece of code, inserted into the FOREACH loop before the delete command:
$myfile = fopen("/www/backup/".$row["ID"].".txt", "w") or die("Unable to open file!");
$txt = json_encode($row);
fwrite($myfile, $txt);
fclose($myfile);
Following your approach...
function delete ($id){
    $result = $db->query("select * from textfiles where id = $id");
    // fetch the matching row from $result into $row here
    // (the exact call depends on your DB wrapper)

    //if you have a filepath, use copy as SebTM suggested
    $path = $row['path']; //assuming path is the column name in your db
    $filename = basename($path); //to get the filename
    $backup_location = '/www/backup/'.$filename;
    copy($path, $backup_location);

    //if instead you have the data in the db
    $content = $row['data']; //assuming the data to be backed up is in a field called 'data'
    $backup_location = '/www/backup/file.txt';
    file_put_contents($backup_location, $content);
}
But this is not the most optimal approach; you could even move the initial query into the delete function above and call it only once, instead of calling it in a loop (roughly as sketched below).
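A rough sketch of folding the backup into a single pass over the query results (the 'path' column and the /www/backup directory are assumptions; adjust the backup step to however your rows actually reference the files):
// One pass: back up each file, then delete its row.
$result = $db->query(sprintf("select ID, path from textfiles where postdate < %s - interval %d day",
    $db->escapeString($currTime_ori["now"]), $this->site->textfilesretentiondays));
foreach ($result as $row)
{
    // 'path' is an assumed column holding the file location
    if (!empty($row['path']) && is_readable($row['path'])) {
        copy($row['path'], '/www/backup/' . basename($row['path']));
    }
    $r->delete($row["ID"]);
}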
Basically what I need is to make queries that depend on the values of a text file that I'm reading.
This is the code that I have.
$con = mysqli_connect($host,$name,$pass,$db);
$file = fopen($_FILES["file"]["tmp_name"], "r") or die("Unable to open file!");
while(!feof($file))
{
$codigo = fgets($file);
$result = mysqli_query($con,"SELECT column FROM table WHERE column='". $codigo."'");
$row = mysqli_fetch_array($result);
echo $row;
}
fclose($file);
mysqli_close($con);
But I can't get what I want. I have several values in my text file, but when I execute this I only get the query result for the last value in the file.
Any suggestion?
If I add this line inside the while:
echo $codigo . "<br>";
All the content of my text file is printed, so I think the problem is in the result variable. Every value in the text file returns null when I run the query, except the last one.
Try
print_r($row);
or
echo $row["column"];
instead of
echo $row;
Otherwise you will always have "Array" returned.
The solution is something silly: when I read the file line by line, PHP also reads the newline (\n), so when I run the query the WHERE condition is never true.
A quick solution is to take a substring (or simply trim the line), for example:
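A minimal sketch of the loop from the question with only that fix applied (trim also handles the false returned by fgets at EOF):
while (!feof($file))
{
    // fgets() keeps the trailing "\n", so strip it before building the query
    $codigo = trim(fgets($file));
    if ($codigo === '') {
        continue; // skip blank lines (and the false returned at EOF)
    }
    $result = mysqli_query($con, "SELECT column FROM table WHERE column='" . $codigo . "'");
    $row = mysqli_fetch_array($result);
    echo $row['column'] . "<br>";
}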
What you are doing wrong is this line:
$row = mysqli_fetch_array($result);
And that's why you only see the last result: you keep overwriting the variable with each new result. Instead of overwriting it, keep appending to an array, then var_dump or print_r the array after the loop.
$row[] = mysqli_fetch_array($result);
It also might help you to use mysqli in an object-oriented fashion, the way it was designed:
$result = $con->query($query);
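Combined, a brief sketch of the object-oriented style with the rows collected into an array (same placeholder table and column names as in the question; the real_escape_string call is my addition, since the values come from an uploaded file):
$rows = array();
while (!feof($file)) {
    $codigo = trim(fgets($file));
    if ($codigo === '') {
        continue;
    }
    $result = $con->query("SELECT column FROM table WHERE column='" . $con->real_escape_string($codigo) . "'");
    if ($result && ($row = $result->fetch_assoc())) {
        $rows[] = $row; // append instead of overwriting
    }
}
print_r($rows); // inspect the whole result set after the loop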
I have this issue where cron runs a php script every 5 minutes to update a list.
However, the list fails to update 5% of the time, and the list ends up blank. I don't believe it's related to cron, because I think I failed to manually generate the list twice out of like 100 tries.
What I believe it's related to is load: when the site has 50+ people on it, the list tends to fail to generate, perhaps because the server is busy. I added a check to make sure it's not MySQL returning no rows (which seems impossible), but it still happens, which leads me to believe fwrite is failing.
<?php
$fileHandle = fopen("latest.html", 'w');
$links = array();
$query1 = $db_conn -> query("SELECT * FROM `views` ORDER BY `date` DESC LIMIT 0,20");
while ($result1 = $db_conn -> fetch_row($query1))
{
$result2 = $db_conn -> fetch_query("SELECT * FROM `title` WHERE `id` = '" . $result1['id'] . "'");
array_push($links, "<a href='/title/" . $result2['title'] . "'>" . $result2['title'] . "</a>");
}
if (count($links) > 0)
fwrite($fileHandle, implode(" • ", $links));
else
echo "Didn't work!";
fclose($fileHandle);
?>
Could there be a slight chance the file is in use so it ends up not working and writing a blank list?
$fileHandle = "latest.html", 'w');
I'm going to assume you mean
$fileHandle = fopen("latest.html", 'w');
The 'w' mode here opens the file, places the file pointer at the start, and truncates the file to zero length.
If you check count($links) before doing this, you won't truncate the file when there is nothing to be written to it.
<?php
$links = "QUERY HERE AND HANDLE THE RESULTS (REMOVED)";
if (count($links) > 0)
{
$fileHandle = fopen("latest.html", 'w');
fwrite($fileHandle, implode(" • ", $links));
fclose($fileHandle);
}
else
{
echo "Didn't work!";
}
?>
Could there be a slight chance the file is in use so it ends up not
working and writing a blank list?
Well, yes. We don't know what other code you run that manipulates latest.html, so we can't really profile it.
Here are some suggestions:
Fix the syntax error in your file handler creation
You can acquire a fopen('w') handle to a file that has an existing fopen('r') process going on, so be sure to use PHP's flock while writing to the file to ensure other processes don't corrupt your list (see the sketch after this list)
Check to see what your logs have to say
Write to a string, then fwrite the entire string, so you spend less time in your inner loop with your file handler open (especially in this case, where it doesn't seem that the string would be that long -- a list of links)
Try outputting your links (datestamped) to a separate file besides latest.html; in the 5% of cases where it fails, look back at the timestamped links and see how they compare. You can also include your query in that file so you can isolate whether the issue is something to do with the DB or with writing to latest.html -- this will be especially useful in the case where your query (which isn't shown) possibly returns no results.
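A short sketch of the flock suggestion (the surrounding code is assumed to be the same as in the question; the key point is holding an exclusive lock before truncating and writing):
// 'c' opens for writing without truncating, so the old list survives
// until we actually hold the lock and are ready to replace it.
$fileHandle = fopen("latest.html", 'c');
if ($fileHandle !== false && flock($fileHandle, LOCK_EX)) {
    ftruncate($fileHandle, 0);                   // now it is safe to empty the file
    fwrite($fileHandle, implode(" • ", $links));
    fflush($fileHandle);
    flock($fileHandle, LOCK_UN);                 // release the lock
    fclose($fileHandle);
}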
I think you are leaving yourself open to the possibility that the query is returning no data. The "removed" logic from your example may help shed light on what's going on. A good way of figuring this out is to write something to a log file, and check that log file after a few dozen iterations of your script. In the interest of having something in your latest.html file, I'd use file_put_contents over your current code.
<?php
$links = array();
$query = "SELECT links FROM tableA";
$result = mysql_query($query);
while ($row = mysql_fetch_row($result)) {
$links[] = $row[0];
}
if (count($links) > 0) {
file_put_contents('latest.html', implode(" * ", $links));
file_put_contents('linkupdate.log', "got links: " . count($links) . "\n", FILE_APPEND);
} else {
file_put_contents('linkupdate.log', "No links? [(" . mysql_errno() . ") " . mysql_error() . "]\n", FILE_APPEND);
}
?>
If we find no links, we won't overwrite the previous data file. If we encounter a MySQL error that might be causing the problem, it'll show up in the log output.
A read on the file shouldn't block a write, but switching to file_put_contents will help reduce the time the file is open and empty (there is some latency while you're performing the query and fetching the results).
Feel free to anonymize your query and post that as well - you definitely could have a problem with the result set since your code otherwise seems like it ought to work.
How would I go about populating a database from info in a csv file using PHP code? I need to practice using php to make database calls but at the moment, all I have access to is this csv file...
Design Considerations:
You probably don't want to load the entire file into memory at once using a function like file_get_contents. With large files this will eat up all of your available memory and cause problems. Instead do like Adam suggested, and read one line at a time.
fgetcsv at php manual
//Here's how you would start your database connection
mysql_connect($serverName, $username, $password);
mysql_select_db('yourDBName');
//open the file as read-only
$file = fopen("file.csv", "r");
// lineLength is unlimited when set to 0
// comma delimited
while($data = fgetcsv($file, $lineLength = 0, $delimiter = ",")) {
//Escape the values first (mysql_real_escape_string is safer than addslashes for SQL) and quote them in case they are strings
$success = mysql_query("INSERT INTO fileTable VALUES('" . mysql_real_escape_string($data[0]) . "','" . mysql_real_escape_string($data[1]) . "')");
if(!$success) {
throw new Exception('failed to insert!');
}
}
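As an aside, if PDO is available on your server, prepared statements sidestep the manual escaping entirely. A rough alternative sketch (the DSN, credentials, and two-column table are placeholders):
$pdo = new PDO('mysql:host=localhost;dbname=yourDBName', $username, $password);
$stmt = $pdo->prepare("INSERT INTO fileTable VALUES (?, ?)");

$file = fopen("file.csv", "r");
while (($data = fgetcsv($file, 0, ",")) !== false) {
    $stmt->execute(array($data[0], $data[1])); // values are bound, no manual escaping needed
}
fclose($file);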
just do it through phpmyadmin: http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
Use the built-in PHP functions to read the CSV and write an output file. Then you can import the SQL into your database. This should work with any type of database.
Don't forget to escape any strings you are using. I used sqlite_escape_string() for that purpose in this example.
$fd = fopen("mydata.csv", "r");
$fdout = fopen("importscript.sql","w");
while(!feof($fd))
{
$line = fgetcsv($fd, 1024); // Read a line of CSV
if ($line === false) continue; // skip blank lines at the end of the file
fwrite($fdout,'INSERT INTO mytable (id,name) '
.'VALUES ('.intval($line[0]).",'".sqlite_escape_string($line[1])."');\r\n");
}
fclose($fdout);
fclose($fd);
I need to export the results of this query into a .csv file so I can create a chart. I just don't have any idea how to go about it, and I'm still fairly new to PHP. Thanks for any help.
$query="SELECT familyID, Fam_End_Date, Fam_Start_Date,
DATEDIFF(date(Fam_End_Date), date(Fam_Start_Date))
AS Days_Between,
TIMEDIFF(time(Fam_Start_Date), time(Fam_End_Date))
AS Time_Between
FROM family
WHERE Fam_End_Date IS NOT NULL
AND Fam_Start_Date IS NOT NULL
AND year(Fam_Start_Date)='$year'";
$result = mysql_db_query($aidDB, $query, $connection);
Try iterating through the result set and use fputcsv to write the rows to a file.
http://php.net/manual/en/function.fputcsv.php
For example:
//continuing from your code above:
$fp = fopen('file.csv', 'w');
while ($row = mysql_fetch_assoc($result)) {
fputcsv($fp,$row);
}
fclose($fp);
It is very straightforward, though...
check this:
http://snipplr.com/view/2234/export-mysql-query-results-to-csv/