Below is some code I am using to "translate" a map array into SQL so I can easily update my database whenever I change my game map. As you can see, it prints the SQL onto the screen so I can copy and paste it.
As my maps get bigger this will become inefficient, since the mass of output will crash the browser. Is it possible to have the script create a .txt file and write all of the data to it instead of printing it onto the screen?
<?php
if (isset($_POST['code'])) {
    $map = $_POST['code'];
    $map = preg_replace("/,\\s*}/i", "}", $map);
    $map = str_replace("{", "[", $map);
    $map = str_replace("}", "]", $map);
    $map = json_decode('[' . $map . ']');

    $arrayCount1 = 0;
    $arrayCount2 = -1;
    $H = sprintf('%05d', 00000);
    $V = sprintf('%05d', 00000);
    $id = 1;

    echo "INSERT INTO `map` (`id`, `horizontal`, `verticle`, `image`) VALUES" . "<br />";
    for ($count1 = 0; $count1 < sizeof($map[0]); $count1++) {
        $arrayCount2++;
        $arrayCount1 = 0;
        $V = sprintf('%05d', $V + 1);
        $H = sprintf('%05d', 00000);
        for ($count2 = 0; $count2 < sizeof($map); $count2++) {
            echo "(" . $id . ", '" . $H . "', '" . $V . "', '" . $map[$arrayCount1][$arrayCount2] . "')," . "<br />";
            $arrayCount1++;
            $id++;
            $H = sprintf('%05d', $H + 1);
        }
    }
}
?>
That should be quite simple. Add
// second parameter 'a' stands for APPEND
$f = fopen('/path/to/the/file/you/want/to/write/to', 'a');
to the beginning of your script.
Add
fclose($f);
to the end of your script to cleanly close the file handle (good practice, even though open handles are closed automatically when the script terminates).
Then change all your echos and prints to
fwrite($f, '<<your string>>');
EDIT:
That way you can even compress the data on the fly using a compression stream wrapper if the amount of data gets really large.
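For example, here is a minimal, untested sketch of the generator from the question with the echos swapped for fwrite calls; the output path (and the commented-out compression wrapper) are assumptions:
<?php
// Hypothetical sketch: same generator as in the question, writing to a file
// instead of echoing. $map is hard-coded here only to keep the example runnable.
$map = [['grass', 'water'], ['sand', 'rock']];

$f = fopen('/tmp/map_inserts.sql', 'w');                      // 'a' would append instead
// To gzip on the fly using a compression stream wrapper:
// $f = fopen('compress.zlib:///tmp/map_inserts.sql.gz', 'w');

fwrite($f, "INSERT INTO `map` (`id`, `horizontal`, `verticle`, `image`) VALUES\n");
$id = 1;
for ($col = 0; $col < count($map[0]); $col++) {
    for ($row = 0; $row < count($map); $row++) {
        $h = sprintf('%05d', $row);        // horizontal, 00000-based as in the question
        $v = sprintf('%05d', $col + 1);    // vertical, 00001-based as in the question
        fwrite($f, "($id, '$h', '$v', '{$map[$row][$col]}'),\n");
        $id++;
    }
}
fclose($f);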
There is an even simpler approach:
ob_start();
# Your code here ...
file_put_contents('yourfile.txt', ob_get_clean());
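Applied to the script in the question, nothing inside the loops has to change, because the buffer captures everything that is echoed. A rough, untested sketch (the file name is an assumption):
<?php
ob_start();                               // start capturing all echoed output
echo "INSERT INTO `map` (...) VALUES\n";  // stands in for the existing echo-based generator
// ... the two for-loops from the question run here unchanged ...
file_put_contents('map_inserts.sql', ob_get_clean());   // flush the buffer into a file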
If this is something you plan on writing to at regular intervals or from different scripts, look at using flock() to lock the file and prevent data corruption.
$fp = fopen("/tmp/lock.txt", "w+");
if (flock($fp, LOCK_EX)) { // do an exclusive lock
    fwrite($fp, "Write something here\n");
    flock($fp, LOCK_UN); // release the lock
} else {
    echo "Couldn't lock the file!";
}
fclose($fp);
$str = <<<your string comes here>>>
if ($fh = @fopen("myfile.txt", "a+")) {
    fputs($fh, $str, strlen($str));
    fclose($fh);
}
this should do...
Write all the lines to a file and then send the file to the client.
Check this post for further instructions
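In case that post disappears, the usual pattern for sending an already-written file to the browser is a couple of headers plus readfile(). A rough, untested sketch (the file name is an assumption):
<?php
// Hypothetical sketch: offer a previously generated file as a download.
$path = 'map_inserts.sql';
header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);   // stream the file contents to the client
exit;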
+my 2 cents:
You may also check your database server's mass data loading features, as most servers can batch-load a file much faster than performing thousands of individual INSERTs.
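For MySQL, for example, that could mean writing tab-separated rows and loading them in one statement with LOAD DATA. A rough, untested sketch; the connection details, file path and column names (taken from the schema in the question) are assumptions:
<?php
// Hypothetical sketch: dump the map as tab-separated rows, then bulk-load them.
// LOAD DATA LOCAL requires local_infile to be enabled on client and server.
$map = [['grass', 'water'], ['sand', 'rock']];   // placeholder map data

$fp = fopen('/tmp/map_rows.txt', 'w');
$id = 1;
for ($col = 0; $col < count($map[0]); $col++) {
    for ($row = 0; $row < count($map); $row++) {
        fwrite($fp, $id++ . "\t" . sprintf('%05d', $row) . "\t"
                  . sprintf('%05d', $col + 1) . "\t" . $map[$row][$col] . "\n");
    }
}
fclose($fp);

$db = new mysqli('localhost', 'user', 'pass', 'game');
$db->query("LOAD DATA LOCAL INFILE '/tmp/map_rows.txt'
            INTO TABLE `map`
            FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n'
            (`id`, `horizontal`, `verticle`, `image`)");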
I am trying to write a file to a database 500 lines at a time, so that I avoid dealing with very large arrays and do not run low on memory. For some reason, I am not getting any errors, but only a very, very small fraction of the file is making it into my table.
$ln = intval(shell_exec("wc -l $text_filename_with_path"));
echo "FILENAME WITH PATH: " . $text_filename_with_path ."\n\n";
echo "ARRAY LENGTH: " . $ln . "\n\n";
//pointer is initialized at zero
$fp = fopen($text_filename_with_path, "r");
$offset = 0;
$c = 0;
while($offset < $ln){
    $row_limit = 500;
    //get a 500 row section of the file
    $chunk = fgets($fp, $row_limit);
    //prepare for `pg_copy_from` by exploding to array
    $chunk = explode("\n", $chunk);
    //each record from the file being read is just one element
    //prepare for three column DB table by adding columns (one
    //unique PK built from UNIX time concat with counter, the
    //other from a non-unique batch ID)
    array_walk($chunk,
        function (&$item, $key) use ($datetime, $c) {
            $item = time() . $c . $key . "\t" . $datetime . "\t" . $item;
        }
    );
    //increase offset in order to move pointer forward
    $offset += $row_limit;
    //set pointer ahead to new position
    fseek($fp, $offset);
    echo "CURRENT POINTER: " . ftell($fp) . "\n"; //prints out 500, 1000, 1500 as expected
    //insert array directly into DB from array
    pg_copy_from($con, "ops.log_cache_test", $chunk, "\t", "\\NULL");
    //increment to keep PK column unique
    $c++;
}
As I say, I am getting only a fraction of the contents of the file, and a lot of the data looks a bit messed up, e.g. about half the entries are blank in the part of the array element that gets assigned to $item within my array_walk() callback. Further, it seems that exploding on \n is not working properly, as lines seem to be exploded at non-uniform positions (i.e., log records don't look symmetrical). Have I just made a total mess out of this?
You are not using fgets properly: its 2nd parameter is the maximum length in bytes to read, not the number of rows.
There are two ways I can think of at the moment to solve it:
1. A loop getting one line at a time, until you've reached your row limit.
The code should look something like this (not tested, assuming the end-of-line char is "\n" with no "\r"):
<?php
/** Your code and initialization here */
while (!feof($file)) {
    $counter = 0;
    $buffer = array();
    // check the counter before calling fgets, so a line is not read and then discarded
    while ($counter < $row_limit && ($line = fgets($file)) !== false) {
        $line = str_replace("\n", "", $line); // fgets keeps the newline char at the end of the line.
        $buffer[] = $line;
        $counter++;
    }
    insertRows($buffer);
}

function insertRows($rows) {
    /** your code here */
}
?>
2. Assuming the file isn't too big: use file_get_contents().
The code should look something like this (same assumptions):
<?php
/** Your code and initialization here */
$data = file_get_contents($filename);
if ($data === FALSE) {
    echo "Could not get content for file $filename\n";
}
$data = explode("\n", $data);
for ($offset = 0; $offset < count($data); $offset += $row_limit) {
    insertRows(array_slice($data, $offset, $row_limit));
}

function insertRows($rows) {
    /** your code here */
}
I didn't test it, so I hope it's ok.
I'm working on a logger class. The JSON that is being added has the following format:
{"log_owner" : "test123","log_message" : "Has logged in","log_timestamp" : "1397921556","log_type" : "1"}
To retrieve it, I require square brackets around all the different JSON objects, like the following:
[
{"log_owner" : "test456","log_message" : "Has logged in","log_timestamp" : "1397921856","log_type" : "2"}
{"log_owner" : "test123","log_message" : "Has logged in","log_timestamp" : "1397921556","log_type" : "1"}
]
I've managed to insert the opening bracket at the beginning of the file whenever the file didn't exist, but the main issue is moving the closing bracket to the end of the file as I'm adding new objects. I've tried to move my file pointer back 2 places so I can overwrite the last bracket and keep appending the closing bracket after every new entry. I'm trying to accomplish this with:
if(!$content_new) {
    $pos = ftell($handle);
    fseek($handle, $pos - 2);
}
fwrite($handle, $content);
fclose($handle);
But it seems that it does not work with .json files, as I'm not able to move my file pointer to any other line or rewind it.
How could I accomplish this? Any kind of guidance or suggestions for improvement are highly appreciated.
Thank you.
A direct solution for your problem, quick and dirty:
<?php
function writeLog($path, $newLine)
{
    $exists = file_exists($path);
    $handle = fopen($path, 'c');
    if (!$exists) {
        // first write to log file
        $line = "[" . PHP_EOL . $newLine . PHP_EOL . "]";
    } else {
        // file exists so it has one or more logged lines
        $line = "," . PHP_EOL . $newLine . PHP_EOL . "]";
        fseek($handle, -(strlen(PHP_EOL) + 1), SEEK_END);
    }
    fwrite($handle, $line);
    fclose($handle);
}

$path = __DIR__ . '/file.json';
// delete file if exists - for tests
if (file_exists($path)) {
    unlink($path);
}

$line = '{"log_owner" : "test123","log_message" : "Has logged in","log_timestamp" : "1397921556","log_type" : "1"}';
for ($i = 0; $i < 10; $i++) {
    writeLog($path, $line);
}
Problems:
concurrency
scaling
JSON is not the easiest format to edit and view
no easy filtering
Use CSV, output JSON
<?php
function writeLogCSV($path, $newLine)
{
    $handle = fopen($path, 'a');
    fputcsv($handle, $newLine);
    fclose($handle);
}

function readLogCsv($path)
{
    $handle = fopen($path, 'r');
    $rows = [];
    while (false !== ($line = fgetcsv($handle))) {
        $rows[] = array_combine(
            ["log_owner", "log_message", "log_timestamp", "log_type"],
            $line
        );
    }
    fclose($handle);
    echo json_encode($rows);
}

$path = __DIR__ . '/file.csv';
// delete file if exists - for tests
if (file_exists($path)) {
    unlink($path);
}

$line = ["test123", "Has logged in", "1397921556", "1"];
for ($i = 0; $i < 10; $i++) {
    writeLogCSV($path, $line);
}

readLogCsv($path);
Good parts:
easy to read and write
Problems:
scaling
concurrency
no easy filtering
Store your log in a database or use a logging service, output JSON (a minimal sketch follows the list below).
Good parts
no concurrency issues
easy filtering
speed
good for scaling
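As an illustration, here is a minimal, untested sketch of that idea using PDO with SQLite; the file name and the column names (mirroring the JSON keys above) are assumptions:
<?php
// Hypothetical sketch: keep log entries in SQLite, output them as JSON.
$db = new PDO('sqlite:' . __DIR__ . '/log.sqlite');
$db->exec("CREATE TABLE IF NOT EXISTS log (
    log_owner TEXT, log_message TEXT, log_timestamp INTEGER, log_type INTEGER
)");

function writeLogDb(PDO $db, $owner, $message, $type)
{
    $stmt = $db->prepare("INSERT INTO log (log_owner, log_message, log_timestamp, log_type)
                          VALUES (?, ?, ?, ?)");
    $stmt->execute([$owner, $message, time(), $type]);
}

function readLogDb(PDO $db)
{
    $rows = $db->query("SELECT * FROM log")->fetchAll(PDO::FETCH_ASSOC);
    echo json_encode($rows);
}

writeLogDb($db, 'test123', 'Has logged in', 1);
readLogDb($db);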
<?php
ini_set('max_execution_time', 864000);
$seq = "D:/Ractip/Sequence.txt";
$mir = "D:/Ractip/mirhominid.txt";
$shandle = fopen($seq, 'r');
$sdata = fread($shandle, filesize($seq));
$mhandle = fopen($mir, 'r');
$mdata = fread($mhandle, filesize($mir));
$sexp = explode(">", $sdata);
$mexp = explode(">", $mdata);
$i = 1;
$a = 1;
$count = count($sexp);
while($i < $count)
{
    $name = explode("\n", $mexp[$a]);
    $name = explode(" ", $name[0]);
    $name1 = explode("\n", $sexp[$i]);
    $file2 = "D:\Ractip\mir\\"."$name[1]".".txt";
    $file1 = "D:\Ractip\sequence\\"."name1[0]".".txt";
    if ($i == 1){
        mkdir("D:/Ractip/Interactions/"."$name[1]", 0777);
    }
    $file = "D:/Ractip/Interactions/"."$name[1]"."/"."$name1[0]"."+"."$name[1]".".txt";
    $fhandle = fopen($file, 'w');
    $query = "ractip "."$file1"." "."$file2";
    $exec = shell_exec($query);
    print $exec;
    fwrite($fhandle, $exec);
    fclose($fhandle);
    if ($i == $count){
        $i = 1;
        $a++;
    }else{
        $i++;
    }
}
?>
This is the script. I am basically using a tool to get results for roughly 37.5 million combinations, so as you can understand it isn't something I can do by hand, hence this script. I previously separated all candidates into individual files, which is why the $name variables are built the way they are.
The problem is the shell_exec command. A preliminary Google search did not explain why it is behaving this way, but shell_exec refuses to process dynamically built commands; if I use a static command like ractip xy.txt zy.txt it processes it fine. What I need to do is build the command and then have shell_exec process it, which unfortunately it isn't doing. It would be really helpful if someone could explain why the command behaves this way and whether there is a workaround.
I'm finally starting to understand what a guy on a forum meant when he said that there are just some things PHP doesn't do very well.
Oh yes, and I am running it through the browser, in case that helps.
On both Windows and Linux, you'll be better off keeping all slashes as "/".
Also, looks like you forgot a $ in $file1:
$file2 = "D:/Ractip/mir/" . $name[1] . '.txt';
$file1 = "D:/Ractip/sequence/" . $name1[0] . ".txt";
Finally, just in case, for clarity, I'd write
$query = "ractip '$file1' '$file2'";
or
$query = 'ractip ' . $file1 . ' ' . $file2 ;
You don't really need to quote a single string variable, i.e. $string and "$string" are the same thing. I quoted $file1 and $file2 with single quotes inside $query because, if the names contain spaces, the ractip utility would get confused as to where one filename stops and the next one starts. Maybe that's not your case here, but anyway...
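A slightly safer variant of the same idea is to let PHP do the quoting with escapeshellarg(). A minimal sketch; the $name/$name1 values are stand-ins for the exploded pieces from the question:
<?php
// Hypothetical sketch: build the ractip command with escapeshellarg(), which
// quotes each argument so spaces in the paths cannot split the command apart.
$name1 = ['SEQ001'];        // stand-in for the exploded sequence name
$name  = ['', 'MIR001'];    // stand-in for the exploded miRNA name

$file1 = "D:/Ractip/sequence/" . $name1[0] . ".txt";
$file2 = "D:/Ractip/mir/" . $name[1] . ".txt";

$query = "ractip " . escapeshellarg($file1) . " " . escapeshellarg($file2);
$exec  = shell_exec($query);
print $exec;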
What I observed in your code is that in the file names you are passing, the slashes are not properly escaped:
$file2 = "D:\\Ractip\\mir\\"."$name[1]".".txt";
$file1 = "D:\\Ractip\\sequence\\"."name1[0]".".txt";
This might be causing the command to look for the wrong file.
I made a script a while ago that wrote to a file, and I did the same thing here, only adding a part to read the file and write it again. What I am trying to achieve is quite simple, but the problem is eluding me: I am trying to make my script write to a file that basically holds the following information
views:{viewcount}
date-last-visited:{MM/DD/YYYY}
last-ip:{IP-Adress}
Now I have done a bit of research and tried several methods of reading the data, but none have returned anything. My current code is as follows.
<?php
$filemade = 0;
if(!file_exists("stats")){
    if(!mkdir("stats")){
        exit();
    }
    $filemade = 1;
}
echo $filemade;
$hwrite = fopen("stats/statistics.txt", 'w');
$icount = 0;
if(filemade == 0){
    $data0 = file_get_contents("stats/statistics.txt");
    $data2 = explode("\n", $data0);
    $data1 = $data_1[0];
    $ccount = explode(":", data1);
    $icount = $ccount[1] + 1;
    echo "<br>icount:".$icount."<br>";
    echo "data1:".$data1."<br>";
    echo "ccount:".$ccount."<br>";
    echo "ccount[0]:".$ccount1[0]."<br>";
    echo "ccount[1]:".$ccount1[1]."<br>";
}
$date = getdate();
$ip=#$REMOTE_ADDR;
fwrite($hwrite, "views:" . $icount . "\nlast-viewed:" . $date[5] . "/" . $date[3] . $date[2] . "/" . $date[6] . "\nlast-ip:" . $ip);
fclose($hwrite);
?>
the result is always:
views:1
last-viewed://
last-ip:
the views never go up, the date never works, and the IP address never shows.
I looked at many sources before finally deciding to ask; I figured I'd get more relevant information this way.
Looking forward to some replies. PHP is my newest language, and so I don't know much.
What I have tried:
$handle_read = fopen("stats/statistics.txt", "r");//make a new file handle in read mode
$data = fgets($handle_read);//get first line
$data_array = explode(":", $data);//split first line by ":"
$current_count = $data_array[1];//get second item, the value
and
$handle_read = fopen("stats/statistics.txt", "r");//make a new file handle in read mode
$pre_data = fread($handle_read, filesize($handle_read));//read all the file data
$pre_data_array = explode("\n", $pre_data);//split the file by lines
$data = pre_data_array[0];//get first line
$data_array = explode(":", $data);//split first line by ":"
$current_count = $data_array[1];//get second item, the value
I have also tried split instead of explode, but I was told split is deprecated and explode is up-to-date.
Any help would be great, thank you for your time.
Try the following:
<?php
if(!file_exists("stats")){
    if(!mkdir("stats")) die("Could not create folder");
}

// file() returns an array of file contents or false
$data = file("stats/statistics.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

if(!$data){
    if(!touch("stats/statistics.txt")) die("Could not create file");
    // Default Values
    $data = array("views:0", "date-last-visited:01/01/2000", "last-ip:0.0.0.0");
}

// Update the data
foreach($data as $key => $val){
    // Limit explode to 2 chunks because we could have
    // IPv6 Addrs (e.g. ::1)
    $line = explode(':', $val, 2);
    switch($key){
        case 0:
            $line[1]++;
            break;
        case 1:
            $line[1] = date('m/d/Y');
            break;
        case 2:
            $line[1] = $_SERVER['REMOTE_ADDR'];
            break;
    }
    $data[$key] = implode(':', $line);
    echo $data[$key] . "<br />";
}

// Write the data back into the file
if(!file_put_contents("stats/statistics.txt", implode(PHP_EOL, $data))) die("Could not write file");
?>
I have a question. I am in the process of learning how to read/write files, but I'm having a little trouble trying to do both at the same time in the same PHP script. I have a text file with contents like this:
Richmond,Virginia
Seattle,Washington
Los Angeles,California
Dallas,Texas
Jacksonville,Florida
I wrote code to sort them, and this displays them in sorted order by city:
<?php
$file = file("states.txt");
sort($file);
for($i=0; $i<count($file); $i++)
{
    $states = explode(",", $file[$i]);
    echo $states[0], $states[1],"<br />";
}
?>
From this, how can I write this sorted information back into the states.txt file?
The easiest way to write the contents of $file back to the file would be using file_put_contents in collaboration with implode.
file_put_contents("states.txt", implode($file));
Try using fopen and fwrite.
$fileWrite = fopen("filePath", "w");
for($i=0; $i<count($file); $i++)
{
    fwrite($fileWrite, $file[$i]);
}
fclose($fileWrite);
<?php
$file = file("states.txt");
sort($file);
$newContent = "";
for($i=0; $i<count($file); $i++)
{
    $states = explode(",", $file[$i]);
    $newContent .= $states[0] .', '. rtrim($states[1]) . PHP_EOL; // rtrim drops the newline file() keeps on each element
}
file_put_contents('states.txt',$newContent);
?>
PHP: file_put_contents
Try something like this:
$fo = fopen("filename", "w");
$content = "";
for ($i = 0; $i < count($file); $i++) {
    $states = explode(",", $file[$i]);
    $content .= $states[0] . "," . rtrim($states[1]) . "\n"; // rtrim drops the newline file() keeps on each element
}
fwrite($fo, $content);
fclose($fo);
This is a little extended, but I thought it might be useful to somebody. I have an m3u playlist and need only particular rows filtered, sorted and printed. Credits go to Devil:
<?php
//specify that the variable is of type array
$masiv = array();

//read the file
$file = '/home/zlobi/radio/pls/all.m3u';
$f = fopen($file, "r");
while ($line = fgets($f))
{
    //skip rows with #EXT
    if(strpos($line, "#EXT") !== false) continue;
    $text = str_replace('.ogg', ' ', $line);
    $text = str_replace('/home/zlobi/radio/',' ',$text);
    //add the song as an element in an array
    $masiv[] = $text;
}
$f = fclose($f);

//sort the array
sort($masiv);

//go over the array, take each element and print it
foreach($masiv as $pesen)
    print $pesen.'<br/>';
?>
masiv means array and pesen means song in Bulgarian :)
Capital letters are sorted first.
Regards
Once you are done reading the file into the array with a call to file(), you can open the file for writing using the fopen function, write into it using fwrite, and close the file handle using fclose:
<?php
$file = file("states.txt"); // read file into array.
$fh = fopen('states.txt','w') or die("..."); // open same file for writing.
sort($file);
for($i=0; $i<count($file); $i++)
{
    $states = explode(",", $file[$i]);
    echo $states[0], $states[1],"<br />";
    fwrite($fh,"$states[0],$states[1] <br />"); // write to file.
}
fclose($fh); // close file.
?>
Open the file, write to it, close it (this assumes $file is the variable from your code):
$fp = fopen('states.txt', 'w');
for($i=0; $i<count($file); $i++) {
    fwrite($fp, $file[$i]);
}
fclose($fp);
And see http://php.net/manual/en/function.fwrite.php
This is by far the fastest and most elegant solution that I found when I had the same problem.
If you're on Linux (with exec allowed in the PHP configuration) you can do the following (provided you want to sort the file numerically):
exec("sort -n " . $pathToOriginalFile . " > " . $pathToSortedFile);
Basically, this executes the shell command sort, which sorts the lines in a file numerically.
If you want the sorted data to end up back in the original file, do this:
exec("sort -n " . $pathToOriginalFile . " > " . $pathToSortedFile);
exec("rm " . $pathToOriginalFile);
exec("mv " . $pathToSortedFile . " " . $pathToOriginalFile);
If you want an alphabetical sort, just omit the -n (--numeric-sort) option.
exec("sort " . $pathToOriginalFile . " > " . $pathToSortedFile);
For me, the command took about 3 seconds to sort 10 million lines on the server.
You can find more about sort here http://www.computerhope.com/unix/usort.htm
Hope it helps.