I'm developing an app where users upload an Excel (.xlsx) file for dumping data into a MySQL database. I have programmed it so that a LOG is created for each import, so the user can see whether any errors occurred. My script was working perfectly before implementing the log system.
After implementing the log system I can see duplicate rows inserted into the database. Also, the die() command is not working.
It just keeps looping continuously!
I have written sample code below. Please tell me what's wrong with my logging method.
Note: if I remove the logging (writing into the file), the script works correctly.
$file = fopen("20131105.txt", "a");
fwrite($file, "LOG CREATED".PHP_EOL);
foreach ($hdr as $k => $v) {
    $username = $v['un'];
    $address  = $v['adr'];
    $message  = $v['msg'];
    if ($username == '') {
        fwrite($file, 'Error: Missing User Name'.PHP_EOL);
        continue;
    } else {
        // insert into database
    }
}
fwrite($file, PHP_EOL."LOG CLOSED");
fclose($file);
echo 1;
die();
First, your die() statement is after your loop. It needs to be inside your loop to end it.
Second, you're looping over $hdr, but it's not defined in your snippet. It has to be an array. What does it contain?
var_dump($hdr);
The documentation for foreach in the PHP manual highlights:
"Reference of a $value and the last array element remain even after the foreach loop. It is recommended to destroy it by unset()."
Try unsetting the value after the foreach using unset($value). This might be the reason for the duplicate values.
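For context, a minimal sketch of the behaviour the manual warns about; note that it only applies when the foreach iterates by reference, which the snippet above does not do:
$arr = array(1, 2, 3);
foreach ($arr as &$value) {
    $value *= 2;
}
// Without this, $value still references $arr[2], and a later
// foreach ($arr as $value) would silently overwrite that element.
unset($value);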
I am using league/csv to parse a CSV file and then later dumping that data into the database.
The structure looks like:
$csv = Reader::createFromPath($csv_file_path, 'r');
$csv->setOutputBOM(Reader::BOM_UTF8);
$csv->addStreamFilter('convert.iconv.ISO-8859-15/UTF-8');
$csv->setHeaderOffset(0);
$csv_header = $csv->getHeader();
$loop = true;
while ($loop) {
    // $offset and $limit are assumed to be initialized (and advanced) elsewhere
    $stmt = (new Statement())
        ->offset($offset)
        ->limit($limit);
    $records = $stmt->process($csv);
    foreach ($records as $record) {
        $rec_arr[] = array_values($record);
    }
    $records_arr = $service->trimArray($rec_arr);
    if (count($records_arr) > 0) {
        foreach ($records_arr as $ck => $cv) {
            // map data and insert into database
        }
    } else {
        $loop = false;
    }
}
Currently, I am implementing this logic inside a Laravel queue job. It successfully inserts the whole set of data, but it never halts the process.
The job stays stuck on the "processing" status. However, if I remove the while loop, it finishes with the "processed" status.
So I think I am implementing some bad logic there.
Looking for an idea to tackle this.
if (count($records_arr) > 0)
This line probably always evaluates to true.
Your code never reaches the $loop = false; end condition.
@stuart thanks for your comment. It was because I previously had a working loop that was driven by multiple AJAX requests; with the queue, I had left the $records / $rec_arr initialization outside of the loop. I moved the array initialization inside the while loop and it works perfectly fine.
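For anyone hitting the same wall, a minimal sketch of that fix, reusing the question's league/csv setup ($csv and the insert logic are assumed from above; the trimArray() call is omitted for brevity):
$offset = 0;
$limit  = 100;
$loop   = true;
while ($loop) {
    $rec_arr = [];                 // reset the accumulator on every iteration
    $stmt = (new Statement())
        ->offset($offset)
        ->limit($limit);
    foreach ($stmt->process($csv) as $record) {
        $rec_arr[] = array_values($record);
    }
    if (count($rec_arr) > 0) {
        // map data and insert into database
        $offset += $limit;         // advance to the next chunk
    } else {
        $loop = false;             // nothing left to read, stop
    }
}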
I have a script that reads about 10k lines of a txt file with an email address on each line.
For each address I check whether it's already in use; if yes, I put the address into an error array, and if not, I put the address into a save array.
After the foreach, and if no addresses are in the error array, I do a $this->Newsletter->saveMany($data).
For some reason, I always get a timeout when I import more than about 500 lines.
Is there another / better way to avoid the timeout?
Please advise!
public function import() {
    $filename = './files/newsletterImport/newsletter.txt';
    if (file_exists($filename)) { // reconstructed: the else branch below reports "No file found!"
        $lines = file($filename);
        $error = array();
        $data  = array();
        foreach ($lines as $line_num => $line) {
            // check unique
            if ($email = $this->Newsletter->find('first', array('conditions' => array('email' => trim($line))))) {
                $error[$line_num]['email']  = trim($line);
                $error[$line_num]['cancel'] = date('d.m.Y H:i:s', strtotime($email['Newsletter']['cancel']));
            } else {
                $data[$line_num]['Newsletter']['email']  = trim($line);
                $data[$line_num]['Newsletter']['active'] = 1;
            }
        }
        if (!$error) {
            $this->Newsletter->create();
            if ($this->Newsletter->saveMany($data)) {
                $this->set('msg', 'Success');
            } else {
                $this->set('msg', 'Error! Nothing imported!');
            }
        } else {
            $this->set('msg', 'Error! Nothing imported!');
            $this->set('error', $error);
        }
    } else {
        $this->set('msg', 'No file found!');
    }
}
Since you don't really know when you time out, a few tips:
1) Measure the time for every bit of code until you know which part is taking most of the available time.
2) You do one query to the DB for each line; that's 10k queries... My heart hurts. Consider doing one big query to get all emails and comparing them in PHP to determine uniqueness (see the sketch after this list). Again, time it; maybe the SQL option turns out to be more efficient (it depends on how you do the search on a PHP array).
3) Try adjusting the options of the save. The atomic option is true by default, and that is probably a bit too much for a 10k-row save.
$this->Newsletter->saveMany($data, array('atomic' => false));
If that does not solve the problem, try dividing the array and saving it in stages. I know it's a pain, but maybe the transaction is just too much.
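A minimal sketch of tips 2 and 3 combined, assuming CakePHP 2.x conventions ($lines comes from the question's file() call; the chunk size of 500 is an arbitrary example):
// One query for all existing emails, flipped so isset() gives O(1) lookups.
$existing = array_flip($this->Newsletter->find('list', array(
    'fields' => array('Newsletter.email'),
)));

$error = array();
$data  = array();
foreach ($lines as $line_num => $line) {
    $email = trim($line);
    if (isset($existing[$email])) {
        $error[$line_num]['email'] = $email;
    } else {
        $data[$line_num]['Newsletter']['email']  = $email;
        $data[$line_num]['Newsletter']['active'] = 1;
    }
}

// Save in non-atomic batches instead of one 10k-row transaction.
if (!$error) {
    foreach (array_chunk($data, 500) as $batch) {
        $this->Newsletter->saveMany($batch, array('atomic' => false));
    }
}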
I have the following PHP code that checks which choice from a radio button group was selected and then writes to a file of the same name.
For example, from a radio button group called "instrument", where the 4 choices are
Wind
Strings
Percussion
Vocal
If the user selects "Wind", it should create and write to a file called "wind_instrument.txt". If "Strings" is selected, it should create the file "strings_instrument.txt", and so on.
Here is my PHP code:
if ($_POST['instrument'] == "wind")
{
    $lines = file('wind_instrument.txt');
    $fopen = fopen("wind_instrument.txt", "w+");
}
elseif ($_POST['instrument'] == "strings")
{
    $lines = file('strings_instrument.txt');
    $fopen = fopen("strings_instrument.txt", "w+");
}
elseif ($_POST['instrument'] == "percussion")
{
    $lines = file('percussion_instrument.txt');
    $fopen = fopen("percussion_instrument.txt", "w+");
}
elseif ($_POST['instrument'] == "vocal")
{
    $lines = file('vocal_instrument.txt');
    $fopen = fopen("vocal_instrument.txt", "w+");
}
Now, if one of the conditions is met, it would then go on to the next step in my code:
fwrite($fopen, ("Instrument: ")."");
fwrite($fopen, $_POST["instrument"]."\n");
fwrite($fopen, ("<br>")."\n");
The problem I have with this is that it is not creating a file, even though I do have permissions set.
Any help will be greatly appreciated, thank you.
You could actually do some refactoring to make this easier to maintain. Nevertheless, that wasn't your problem, so I shall try to help you out.
<?php
$instruments = array('wind', 'strings', 'percussion', 'vocal');
if (in_array($_POST['instrument'], $instruments))
{
    $instrument = $_POST['instrument'];
    $file_handle = fopen($instrument.'_instrument.txt', 'a+');
    $line = 'Instrument: '.$instrument."\n";
    fwrite($file_handle, $line);
}
?>
The important thing to note is how I open the file: I use the mode a+. The documentation says:
Open for reading and writing; place the file pointer at the end of the file. If the file does not exist, attempt to create it.
Hope that helps.
If you've verified that you have permissions to open and write to a file, then there should be no problem doing it based on a conditional. I suggest checking the contents of $_POST and making sure that instrument is present and matches one of your conditions. Alternatively, you could add an else clause that writes the submission to an error file if no valid instrument was received; a sketch of such a clause follows. If that works, it would confirm that the problem is with the POSTed variable, not with fopen/fwrite.
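A minimal sketch of that fallback, written as a standalone check (the error file name and log format are illustrative):
$valid  = array('wind', 'strings', 'percussion', 'vocal');
$choice = isset($_POST['instrument']) ? $_POST['instrument'] : '(not set)';
if (!in_array($choice, $valid)) {
    // Log whatever actually arrived so you can inspect it later.
    $err = fopen('instrument_errors.txt', 'a'); // hypothetical file name
    fwrite($err, date('c') . ' unexpected instrument: ' . $choice . "\n");
    fclose($err);
}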
If the options you provided in the bulleted list are the literal values of your radio buttons, then your problem is that they're capitalized while the values you test in the if statements aren't. Either capitalize them consistently, or use strtolower() to convert everything to a consistent case before comparing.
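For instance, a one-line normalization before the comparisons:
// Normalize case (and stray whitespace) once, then compare lowercase values.
$instrument = strtolower(trim($_POST['instrument']));
if ($instrument == "wind") {
    // ...
}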
I have been struggling to create a simple (really simple) chat system for my website, as my knowledge of JavaScript/AJAX is limited. After gathering resources and help from many kind people, I was able to create my simple chat system, but I am left with one problem.
The messages are posted to a file called "msg.html" in this format:
<p><span id="name">$name</span><span id="Msg">$message</span></p>
Then, using PHP and AJAX, I retrieve the messages instantly from the file using the file() function and a foreach() loop within PHP. Here is the code:
<?php
$file = 'msg.html';
$data = file($file);
$max_lines = 20;
if (count($data) > $max_lines) {
    // here I want the data to be deleted from oldest until I only have 20 messages left.
}
foreach ($data as $line_num => $line) {
    echo $line_num . " . " . $line;
}
?>
My question is: how can I delete the oldest messages so that I am only left with the latest 20 messages?
How does something like this seem to you:
$file = 'msg.html';
$data = file($file);
$max_lines = 20;
foreach ($data as $line_num => $line)
{
    if ($line_num < $max_lines)
    {
        echo $line_num . " . " . $line;
    }
    else
    {
        unset($data[$line_num]);
    }
}
// file_put_contents() accepts an array and writes the elements concatenated.
file_put_contents('msg.html', $data);
See http://www.php.net/manual/en/function.file-put-contents.php for more info :)
I suppose you can read the file, explode it into an array, chop off everything but the last 20 entries, and write it back to the file, overwriting the old one... Perhaps not the best solution, but one that comes to mind if you really can't use a database as Delan suggested.
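A minimal sketch of that idea, assuming new messages are appended to the end of msg.html:
$lines = file('msg.html');
$lines = array_slice($lines, -20);      // keep only the newest 20 lines
file_put_contents('msg.html', $lines);  // accepts an array; elements are concatenated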
That's called round-robin if I recall correctly.
As far as I know, you can't remove arbitrary portions of a file. You need to overwrite the file with the new contents (or create a new file and remove the old one). You could also store messages in individual files, but of course that implies up to $max_lines files to read.
You should also use flock() to avoid data corruption. Depending on the platform it's not 100% reliable, but it's better than nothing.
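A minimal sketch combining the rewrite with the flock() suggestion (error handling elided; the 'c+' mode opens for reading and writing without truncating and creates the file if missing):
$fh = fopen('msg.html', 'c+');
if ($fh && flock($fh, LOCK_EX)) {        // exclusive lock while we rewrite
    $lines = array_slice(file('msg.html'), -20);
    ftruncate($fh, 0);                   // drop the old contents
    rewind($fh);
    fwrite($fh, implode('', $lines));    // file() keeps the trailing newlines
    fflush($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
}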
I'm working on a project for a client: a WordPress plugin that creates and maintains a database of organization members. I'll note that this plugin creates a new table within the WordPress database (instead of dealing with the data as custom_post_type meta data). I've made a lot of modifications to much of the plugin, but I'm having an issue with a feature (that I've left unchanged).
One half of this feature does a CSV import and insert, and that works great. The other half is a feature to download the contents of this table as a CSV. This part works fine on my local system but fails when running from the server. I've pored over each portion of this script and everything seems to make sense. I'm, frankly, at a loss as to why it's failing.
The PHP file that contains the logic is simply linked to. The file:
<?php
// initiate wordpress
include('../../../wp-blog-header.php');
// phpinfo();

function fputcsv4($fh, $arr) {
    $csv = "";
    while (list($key, $val) = each($arr)) {
        $val = str_replace('"', '""', $val);
        $csv .= '"'.$val.'",';
    }
    $csv = substr($csv, 0, -1);
    $csv .= "\n";
    if (!@fwrite($fh, $csv))
        return FALSE;
}

//get member info and column data
$table_name = $wpdb->prefix . "member_db";
$year = date('Y');
$members = $wpdb->get_results("SELECT * FROM ".$table_name, ARRAY_A);
$columns = $wpdb->get_results("SHOW COLUMNS FROM ".$table_name, ARRAY_A);
// echo 'SQL: '.$sql.', RESULT: '.$result.'<br>';

//output headers
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"members.csv\"");

//open output stream
$output = fopen("php://output", 'w');

//output column headings
$data[0] = "ID";
$i = 1;
foreach ($columns as $column) {
    //DIAG: echo '<pre>'; print_r($column); echo '</pre>';
    $field_name = '';
    $words = explode("_", $column['Field']);
    foreach ($words as $word) $field_name .= $word.' ';
    if ($column['Field'] != 'id' && $column['Field'] != 'date_updated') {
        $data[$i] = ucwords($field_name);
        $i++;
    }
}
$data[$i] = "Date Updated";
fputcsv4($output, $data);

//output data
foreach ($members as $member) {
    // echo '<pre>'; print_r($member); echo '</pre>';
    $data[0] = $member['id'];
    $i = 1;
    foreach ($columns as $column) {
        //DIAG: echo '<pre>'; print_r($column); echo '</pre>';
        if ($column['Field'] != 'id' && $column['Field'] != 'date_updated') {
            $data[$i] = $member[$column['Field']];
            $i++;
        }
    }
    $data[$i] = $member['date_updated'];
    //echo '<pre>'; print_r($data); echo '</pre>';
    fputcsv4($output, $data);
}
fclose($output);
?>
So, obviously, a routine wherein a query is run, $output is established with fopen(), each row is formatted as comma-delimited and written with fwrite(), and finally the file is closed with fclose(), at which point it gets pushed to the local system.
The error that I'm getting (from the server) is
Error 6 (net::ERR_FILE_NOT_FOUND): The file or directory could not be found.
But it clearly is getting found; it's just failing. If I enable phpinfo() (PHP Version 5.2.17) at the top of the file, I definitely get a response, notably "Cannot modify header information" (I'm pretty sure because phpinfo() has already generated a header). All the expected data does get printed to the bottom of the page (after all the phpinfo diagnostics), however, so that much at least is working correctly.
I am guessing there is something preventing the fopen, fwrite, or fclose functions from working properly (a server setting?), but I don't have enough experience with this to identify exactly what the problem is.
I'll note again that this works exactly as expected in my test environment (localhost/XAMPP, NetBeans).
Any thoughts would be most appreciated.
update
Ok, I spent some more time with this today. I've tried each of the suggested fixes, including @Rudu's writeCSVLine fix and @Fernando Costa's file_put_contents() recommendation. The fact is, they all work locally. Whether just echoing or using the fopen/fwrite/fclose routine, it doesn't matter; it works great.
What does seem to be a problem is the inclusion of wp-blog-header.php at the start of the file combined with the additional header() calls. (The path is definitely correct on the server, by the way.)
If I comment out the include, I get a CSV file downloaded with some errors planted in it (because $wpdb doesn't exist). And if I comment out the headers, I get all my data printed to the page.
So... any ideas what could be going on here? Some obvious conflict between the WordPress environment and the proper creation of a file?
I'm learning a lot, but no closer to an answer... Thinking I may need to just avoid the WordPress stuff and do a manual SQL query.
Ok, so I'm wondering why you've taken this approach. There's nothing wrong with php://output, but all it does is let you write to the output buffer the same way as print and echo, so if you're having trouble with it, just use print or echo :) Any optimization you could have gained from using fwrite() on the stream is lost anyway by string-building the $csv variable and then writing it in one go (not that optimizations are particularly necessary here). With all that in mind, my solution (in keeping with your original design) would be this:
function escapeCSVcell($val) {
    return str_replace('"', '""', $val);
    // What about new lines in values? Perhaps not relevant to your
    // data, but they'll mess up your output ;)
}

function writeCSVLine($arr) {
    $first = true;
    foreach ($arr as $v) {
        if (!$first) { echo ","; }
        $first = false;
        echo "\"".escapeCSVcell($v)."\"";
    }
    echo "\n"; // May want to use \r\n depending on the consuming script
}
Now use writeCSVLine in place of fputcsv4.
I ran into this same issue. I stumbled upon this thread, which does the same thing but hooks into the 'plugins_loaded' action and exports the CSV then: https://wordpress.stackexchange.com/questions/3480/how-can-i-force-a-file-download-in-the-wordpress-backend
Exporting the CSV early eliminates the risk of the headers already having been modified before you get to them.
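A minimal sketch of that approach; add_action() and the 'plugins_loaded' hook are real WordPress APIs, while the trigger parameter and function name here are illustrative:
// Runs before WordPress sends any of its own output, so header() still works.
add_action('plugins_loaded', 'maybe_export_members_csv');

function maybe_export_members_csv() {
    if (empty($_GET['export_members_csv'])) { // hypothetical trigger parameter
        return;
    }
    global $wpdb;
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="members.csv"');
    $out = fopen('php://output', 'w');
    $rows = $wpdb->get_results("SELECT * FROM {$wpdb->prefix}member_db", ARRAY_A);
    if ($rows) {
        fputcsv($out, array_keys($rows[0])); // column headings
        foreach ($rows as $row) {
            fputcsv($out, $row);
        }
    }
    fclose($out);
    exit; // stop WordPress from rendering anything after the CSV
}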