I've got a PHP file echoing hashes from a MySQL database. This is necessary for a remote program I'm using, but I also need another PHP script to open that file and check it for specified strings after it has been parsed. If it checks for the string pre-parsing, it'll just get the MySQL query rather than the strings to look for.
I'm not sure if any functions do this. Does fopen() read the file prior to parsing? Or file_get_contents()?
If so, is there a function that will read the file after the PHP and MySQL code has run?
The file with the hashes query and echo is in the same directory as the php file reading it, if that makes a difference.
Perhaps fopen reads it post-parse and I've done something wrong, but at first I was storing the hashes directly in the file and it was working fine. After I changed it to echo the contents of the MySQL table, it broke.
The MySQL Query script:
$query="SELECT * FROM list";
$result=mysql_query($query);
while($row=mysql_fetch_array($result, MYSQL_ASSOC)){
echo $row['hash']."<br>";
}
What I was using to get the hash from this script before, when it was just a list of hashes:
$myFile = "hashes.php";
$fh = fopen($myFile, 'r');
$theData = fread($fh, filesize($myFile));
fclose($fh);
$mystring = $theData;
$findme = $hash;
$pos = strpos($mystring, $findme);
The easiest thing to do would be to modify your first PHP file, the one that echoes everything, along these lines:
change every instance of echo to e.g. $data[] =
at the bottom, do foreach($data as $d) echo $d; (this will produce the same output as you have right now)
you now still have your $data array, which you can loop through and do whatever you like with.
To provide working code examples, it would be great if you could post the current code of your file.
EDIT
If you change your script like so:
$query="SELECT * FROM list";
$result=mysql_query($query);
while($row=mysql_fetch_array($result, MYSQL_ASSOC)){
$data[] = $row['hash']."<br />";
}
foreach($data as $d) {
echo $d;
}
...you'll have the array $data, which contains each hash as an element. You can then loop through this array like so:
foreach ($data as $d) {
    //do something
}
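For example, to check whether a particular hash is present (a minimal sketch, assuming $data was built as above and $hash holds the value you're looking for, so the "<br />" markup is stripped before comparing):
// Remove the "<br />" markup that was appended to each entry, then compare.
$hashes = array();
foreach ($data as $d) {
    $hashes[] = trim(str_replace('<br />', '', $d));
}
if (in_array($hash, $hashes)) {
    // the hash exists in the list
}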
Related
I've got this variable which I want to save:
$counter = $counter['counter']++;
to make it so that it increments when I refresh the page.
Therefore, I decided to use fopen, overwrite the $counter variable and then save it.
I am still a beginner and I do not know how fopen, fgets and fclose work.
So I wrote something like this.
$fp = fopen("file.php","r");
fwrite($fp, $counter);
fclose($fp);
So I wanted to open the file (file.php, the file in which I am writing this code, FYI), then overwrite the variable after it has been incremented, and then save it by closing.
But the code doesn't seem to work and the variable does not increment.
What am I doing wrong?
Should I keep this $counter in a different file and pull the variable from there?
FYI: I don't want to use session_start() and $_SESSION because I am running this from a cron job and sessions will not work there.
EDIT
$result = mysql_query('SELECT MIN(ID) AS min, MAX(ID) AS max FROM ytable') or exit(mysql_error());
$row = mysql_fetch_assoc($result);
if ($counter['counter'] < $row['max']) {
    if (isset($counter['counter'])) {
        $counter = $counter['counter']++;
    } else {
        $counter = $counter['counter'] = 0;
    }
}
This is more of the code, for those who were confused.
You could read the file like this: open the file with the previous counter, fetch the counter and increment it:
$counter = (int) file_get_contents("file.php"); // file_get_contents() returns the contents; readfile() would echo them and return the byte count instead
$counter++;
To write to the file, open the file in write mode like this
$fp = fopen("file.php","w");
fwrite($fp, $counter);
fclose($fp);
Is that the result you are looking for?
As chris85 suggested, a database is normally used for this kind of case; it's better to use the database instead of file handling.
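For what it's worth, a minimal sketch of the database route, using the same mysql_* functions as the rest of this thread (the counters table and column names here are made up):
// Increment atomically in MySQL, then read the new value back.
mysql_query("UPDATE counters SET value = value + 1 WHERE name = 'page_counter'");
$result = mysql_query("SELECT value FROM counters WHERE name = 'page_counter'");
$row = mysql_fetch_assoc($result);
$counter = (int) $row['value'];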
So say you have file.txt and that contains 5. In that same directory you have script.php; this file would have
$counter = file_get_contents('file.txt'); // this takes in whatever value is in the file e.g in this case '5'
$counter++; // increment the value by 1 so we are now at 6
file_put_contents('file.txt', $counter); //write the value '6' back to the file
file.txt would then have 6, after the first load.
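Since this runs from a cron job, two overlapping runs could read the same value and lose an increment. A minimal sketch that guards against that with flock() (the counter.txt name is just an example):
// Increment a counter stored in counter.txt while holding an exclusive lock,
// so overlapping cron runs don't clobber each other.
$fp = fopen('counter.txt', 'c+'); // create the file if missing, don't truncate it
if (flock($fp, LOCK_EX)) {
    $counter = (int) stream_get_contents($fp);
    $counter++;
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) $counter);
    fflush($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);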
I am working on a PHP script that runs a query and then writes the data to a .csv file. The problem I am having is that some of the data I am receiving from the server contains commas, which causes the .csv file to put data in the wrong place. Below is an example of the code.
$sql = "Select *
From table;"
$data = mysqli_query($link, $sql);
$row= ("Column One, Column Two, Column Three\n");
while ($result = $data->fetch_assoc()) {
$row .= ("$result[columnOne], $result[columnTwo], $result[columnThree]\n");
}
$fd = fopen("./filePath.csv", "w") or die ("Error Message");
fwrite($fd, $row);
fclose($fd);
Column three is where the data contains commas, which causes it to be written across multiple cells in the .csv file. Is there any way to keep the $result['columnThree'] data in one cell even though it contains commas?
You can wrap the values in double-quotes:
$row .= ('"'.$result['columnOne'].'", "'.$result['columnTwo'].'", "'.$result['columnThree'].'"\n"');
Instead of concatenating a string, I like to use arrays as much as possible:
$rawCsv = array();
while ($result = $data->fetch_assoc()) {
    if (count($rawCsv) === 0)
        $rawCsv[] = '"'.implode('","', array_keys($result)).'"';
    $rawCsv[] = '"'.implode('","', $result).'"';
}
$csvString = implode("\n", $rawCsv);
Both of these approaches will have a hard time with a different character in your data, though: the double quote. With that in mind, an even better alternative is to use fopen and fputcsv to create your CSV data, so you don't have to think about escaping at all.
If you plan to immediately offer the CSV data for download, you don't need a file at all; just dump it into the output buffer:
ob_start();
$file_handle = fopen("php://output", 'w');
... if you do want to hang on to a file, then use fopen on the desired output file and skip the call to ob_start
Next, assemble your data:
fputcsv($file_handle, array(
    'Your',
    'headings',
    'here'
));
while ($result = $data->fetch_assoc()) {
    fputcsv($file_handle, array(
        $result['Your'],
        $result['data'],
        $result['here']
    ));
}
fclose($file_handle);
... If you're using a file, then you're all set! If you are using the output buffer (no file used), you can grab the CSV data and send it to the browser directly:
$csv = ob_get_clean();
echo $csv; // should send headers first!
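For example, the download headers could look something like this (a minimal sketch; the filename is just illustrative), sent before echoing the CSV body:
// Tell the browser this response is a CSV attachment.
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="export.csv"');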
Be careful with output buffering, though; some frameworks/applications make use of it internally. If you're running into problems with it, try using a file. If the file works, then your framework is probably already doing something with the output buffer.
Documentation
RFC 4180 Common Format and MIME Type for Comma-Separated Values (CSV) Files - https://www.rfc-editor.org/rfc/rfc4180
implode - http://php.net/manual/en/function.implode.php
fopen - http://php.net/manual/en/function.fopen.php
fclose - http://php.net/manual/en/function.fclose.php
fputcsv - http://php.net/manual/en/function.fputcsv.php
ob_start - http://php.net/manual/en/function.ob-start.php
ob_get_clean - http://php.net/manual/en/function.ob-get-clean.php
Hi, I want to append a row at the beginning of a file using PHP.
Let's say, for example, the file contains the following content:
Hello Stack Overflow, you are really helping me a lot.
And now I want to add a row on top of the previous one, like this:
www.stackoverflow.com
Hello Stack Overflow, you are really helping me a lot.
This is the code that I have at the moment in a script:
$fp = fopen($file, 'a+') or die("can't open file");
$theOldData = fread($fp, filesize($file));
fclose($fp);
$fp = fopen($file, 'w+') or die("can't open file");
$toBeWriteToFile = $insertNewRow.$theOldData;
fwrite($fp, $toBeWriteToFile);
fclose($fp);
I want an optimal solution for it, as I am using it in a PHP script. Here are some solutions I found on here:
Need to write at beginning of file with PHP
which says the following to append at the beginning:
<?php
$file_data = "Stuff you want to add\n";
$file_data .= file_get_contents('database.txt');
file_put_contents('database.txt', $file_data);
?>
And another one here:
Using php, how to insert text without overwriting to the beginning of a text file
says the following:
$old_content = file_get_contents($file);
fwrite($file, $new_content."\n".$old_content);
So my final question is: which of the above methods is best (I mean most optimal) to use? Is there possibly anything better than these?
Looking for your thoughts on this!
function file_prepend($string, $filename) {
    $fileContent = file_get_contents($filename);
    file_put_contents($filename, $string . "\n" . $fileContent);
}
usage :
file_prepend("couldn't connect to the database", 'database.logs');
My personal preference when writing to a file is to use file_put_contents
From the manual:
This function is identical to calling fopen(), fwrite() and fclose()
successively to write data to a file.
Because the function automatically handles those three calls for me, I do not have to remember to close the resource after I'm done with it.
There is no really efficient way to write before the first line of a file. Both solutions mentioned in your question create a new file by copying everything from the old one and then writing the new data (and there is not much difference between the two methods).
If you are really after efficiency, i.e. avoiding a whole copy of the existing file, and you need the last inserted line to be the first one in the file, it all depends on how you plan to use the file after it is created.
three files
Per your comment, you could create three files (header, content and footer) and output each of them in sequence; that would avoid the copy even if the header is created after the content.
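A minimal sketch of that idea (the file names are just examples):
// Emit the three pieces in order; each can be written independently,
// so header.txt can be created after content.txt without any copying.
foreach (array('header.txt', 'content.txt', 'footer.txt') as $part) {
    readfile($part); // streams each piece straight to the output
}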
work reverse in one file
This method keeps the file contents in memory (an array).
Since you know you create the content before the header, always write the lines in reverse order: footer, then content, then header:
function write_reverse($lines, $file) { // $lines is an array
    for ($i = count($lines) - 1; $i >= 0; $i--) fwrite($file, $lines[$i]);
}
You then call write_reverse() first with the footer, then the content, and finally the header. Each time you want to add something at the beginning of the file, you just write it at the end...
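For example (a sketch, assuming $headerLines, $contentLines and $footerLines are arrays holding the lines of each section):
$fh = fopen('reversed.txt', 'a');
write_reverse($footerLines, $fh);  // written first, so it comes out last when read back
write_reverse($contentLines, $fh);
write_reverse($headerLines, $fh);  // written last, so the reader below prints it first
fclose($fh);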
Then, to read the file back for output:
$lines = array();
while (($line = fgets($file)) !== false) $lines[] = $line;
// then print from the last one
for ($i = count($lines) - 1; $i >= 0; $i--) echo $lines[$i];
Then there is another consideration: could you avoid using files at all, e.g. via PHP APC?
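A tiny sketch of the APC idea (assuming the APC extension is available; the cache key is arbitrary):
// Keep the lines in an array cached by APC instead of touching a file;
// prepending is then just an array_unshift().
$lines = apc_fetch('log_lines');
if ($lines === false) {
    $lines = array();
}
array_unshift($lines, "couldn't connect to the database");
apc_store('log_lines', $lines);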
You mean prepending. I suggest you read a block, overwrite it with the new data, and keep sliding each displaced block forward so nothing is lost.
<?php
$dataToBeAdded = "www.stackoverflow.com";
$file = "database.txt";

$blockSize = strlen($dataToBeAdded); // work in fixed-size blocks
$handle = fopen($file, "r+");
$final_length = filesize($file) + $blockSize;

// Save the first block, overwrite it with the new data, then keep
// sliding the saved block forward until the whole file has shifted.
$existingData = fread($handle, $blockSize);
rewind($handle);
$i = 1;
while (ftell($handle) < $final_length)
{
    fwrite($handle, $dataToBeAdded);            // write the pending block
    $dataToBeAdded = $existingData;             // the displaced block is written next
    $existingData = fread($handle, $blockSize); // save the block about to be overwritten
    fseek($handle, $i * $blockSize);            // position the next write right after the last one
    $i++;
}
fclose($handle);
?>
I'm working on a project for a client: a WordPress plugin that creates and maintains a database of organization members. I'll note that this plugin creates a new table within the WordPress database (instead of dealing with the data as custom_post_type meta data). I've made a lot of modifications to much of the plugin, but I'm having an issue with a feature (that I've left unchanged).
One half of this feature does a CSV import and insert, and that works great. The other half is a feature to download the contents of this table as a CSV. This part works fine on my local system, but fails when running from the server. I've pored over each portion of this script and everything seems to make sense. I'm, frankly, at a loss as to why it's failing.
The PHP file that contains the logic is simply linked to. The file:
<?php
// initiate wordpress
include('../../../wp-blog-header.php');
// phpinfo();
function fputcsv4($fh, $arr) {
    $csv = "";
    while (list($key, $val) = each($arr)) {
        $val = str_replace('"', '""', $val);
        $csv .= '"'.$val.'",';
    }
    $csv = substr($csv, 0, -1);
    $csv .= "\n";
    if (!@fwrite($fh, $csv))
        return FALSE;
}
//get member info and column data
$table_name = $wpdb->prefix . "member_db";
$year = date ('Y');
$members = $wpdb->get_results("SELECT * FROM ".$table_name, ARRAY_A);
$columns = $wpdb->get_results("SHOW COLUMNS FROM ".$table_name, ARRAY_A);
// echo 'SQL: '.$sql.', RESULT: '.$result.'<br>';
//output headers
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"members.csv\"");
//open output stream
$output = fopen("php://output",'w');
//output column headings
$data[0] = "ID";
$i = 1;
foreach ($columns as $column){
//DIAG: echo '<pre>'; print_r($column); echo '</pre>';
$field_name = '';
$words = explode("_", $column['Field']);
foreach ($words as $word) $field_name .= $word.' ';
if ( $column['Field'] != 'id' && $column['Field'] != 'date_updated' ) {
$data[$i] = ucwords($field_name);
$i++;
}
}
$data[$i] = "Date Updated";
fputcsv4($output, $data);
//output data
foreach ($members as $member){
// echo '<pre>'; print_r($member); echo '</pre>';
$data[0] = $member['id'];
$i = 1;
foreach ($columns as $column){
//DIAG: echo '<pre>'; print_r($column); echo '</pre>';
if ( $column['Field'] != 'id' && $column['Field'] != 'date_updated' ) {
$data[$i] = $member[$column['Field']];
$i++;
}
}
$data[$i] = $member['date_updated'];
//echo '<pre>'; print_r($data); echo '</pre>';
fputcsv4($output, $data);
}
fclose($output);
?>
So, obviously, it's a routine wherein a query is run, $output is opened with fopen, each row is formatted as comma-delimited and written with fwrite, and finally the handle is closed with fclose, at which point the file gets pushed to the local system.
The error that I'm getting (from the server) is
Error 6 (net::ERR_FILE_NOT_FOUND): The file or directory could not be found.
But it clearly is getting found; it's just failing. If I enable phpinfo() (PHP Version 5.2.17) at the top of the file, I definitely get a response, notably "Cannot modify header information" (I'm pretty sure because phpinfo() has already generated output). All the expected data does get printed at the bottom of the page (after all the phpinfo diagnostics), however, so that much at least is working correctly.
I am guessing there is something preventing the fopen, fwrite, or fclose functions from working properly (a server setting?), but I don't have enough experience with this to identify exactly what the problem is.
I'll note again that this works exactly as expected in my test environment (localhost/XAMPP, netbeans).
Any thoughts would be most appreciated.
update
Ok, I spent some more time with this today. I've tried each of the suggested fixes, including @Rudu's writeCSVLine fix and @Fernando Costa's file_put_contents() recommendation. The fact is, they all work locally. Whether I just echo or use the fopen/fwrite/fclose routine, it doesn't matter; it works great.
What does seem to be a problem is the inclusion of the wp-blog-header.php at the start of the file and then the additional header() calls. (The path is definitely correct on the server, btw.)
If I comment out the include, I get a CSV file downloaded with some errors planted in it (because $wpdb doesn't exist). And if I comment out the headers, I get all my data printed to the page.
So... any ideas what could be going on here?
Some obvious conflict between the WordPress environment and the proper creation of a file.
Learning a lot, but no closer to an answer... Thinking I may need to just avoid the WordPress stuff and do a manual SQL query.
Ok, so I'm wondering why you've taken this approach. There's nothing wrong with php://output, but all it does is let you write to the output buffer the same way as print and echo... if you're having trouble with it, just use print or echo :) Any optimization you could have gained from using fwrite on the stream is then lost by string-building the $csv variable and writing that in one go to the output stream (not that optimizations are particularly necessary here). With all that in mind, my solution (in keeping with your original design) would be this:
function escapeCSVcell($val) {
    return str_replace('"', '""', $val);
    // What about new lines in values? Perhaps not relevant to your
    // data, but they'll mess up your output ;)
}

function writeCSVLine($arr) {
    $first = true;
    foreach ($arr as $v) {
        if (!$first) { echo ","; }
        $first = false;
        echo "\"" . escapeCSVcell($v) . "\"";
    }
    echo "\n"; // May want to use \r\n depending on the consuming script
}
Now use writeCSVLine in place of fputcsv4.
Ran into this same issue. Stumbled upon this thread, which does the same thing but hooks into the 'plugins_loaded' action and exports the CSV then: https://wordpress.stackexchange.com/questions/3480/how-can-i-force-a-file-download-in-the-wordpress-backend
Exporting the CSV early eliminates the risk of the headers having already been modified before you get to them.
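A rough sketch of that approach (the query-string check and the export function name here are just illustrative assumptions):
// Run early, before WordPress or the theme has sent any output.
add_action('plugins_loaded', 'maybe_export_members_csv');

function maybe_export_members_csv() {
    // Hypothetical trigger: only export when this query flag is present.
    if (empty($_GET['export_members_csv'])) {
        return;
    }
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="members.csv"');

    $output = fopen('php://output', 'w');
    fputcsv($output, array('ID', 'Name', 'Date Updated')); // example headings
    // ... fputcsv() one row per member here, as in the original script ...
    fclose($output);
    exit; // stop WordPress from rendering anything else
}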
Here's my code:
$cont = file_get_contents("users.txt");
$lines = explode("\n", $cont, true);
if (in_array('$name', $line)) {
    echo "Error user $name in database";
    exit;
}
I have a file with user names in a text file, which then gets turned into an array ($lines).
I need it to search the array to see if the user name is in the text file.
If the variable $name is a string containing the name you would like to find in the array you created from the file, it should look more like this:
$cont = file_get_contents("users.txt");
$lines = explode("\n", $cont); // note: explode()'s third argument is an integer limit, so passing true would cap the result at one element
if (in_array($name, $lines)) {
    echo "Error user $name in database";
    exit;
}
I would personally recommend going the route of regular expressions, though. Also, I can see the explode being a memory hog as the file grows, potentially slowing the site drastically.
I have not looked into this deeply, but if you are going flat file, you may want to try something along the lines of http://www.niblr.com/php-flat-file-search-script/ if there will be a large number of users in the system. Or even just run a regular expression against the result of file_get_contents.
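For instance, a minimal sketch of the regular-expression idea (assuming one user name per line in users.txt):
$cont = file_get_contents("users.txt");
// ^ and $ with the "m" modifier anchor to individual lines;
// preg_quote() escapes any regex metacharacters in the name.
$pattern = '/^' . preg_quote($name, '/') . '\r?$/m';
if (preg_match($pattern, $cont)) {
    echo "Error user $name in database";
    exit;
}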