I have a CSV file I need to clean up. It contains 13 fields, but I only need 7 (Business, Address, City, St, Zip, Phone, Email).
I need to run through all of the records and create a new output of just the records with email addresses.
In a nutshell... I load the original file, run the for loop, explode the results, then look for the records where the $tmp[10] index is not null. I then grab the rest of the required fields, do a foreach loop, and fwrite the results to a new CSV file.
Depending on how I tweak the code, I get either...
A text file of just email addresses.
or
A text file of just the last record with an email address.
I have been working on this too long and I just need a fresh set of eyes to point out the problem. I am new to PHP and want to make this work. Thanks in advance.
<?php
// See if file exists and is readable
$file = 'uploads/AK_Accountants.csv';
$newfile = basename($file, ".csv");
$newfile = $newfile . Date("Ymd") . ".csw";
$fileNew = fopen('uploads/AK_' . Date("Ymd") . '.csv', 'w+');
// Read the file into an array called $sourcefile
$sourcefile = file($file);
// Loop through the array and process each line
for ($i = 0; $i < count($sourcefile); $i++) {
    // Separate each element and store in a temp array
    $tmp = explode('","', $sourcefile[$i]);
    // Assign each element of the temp array to a named array key
    if ($tmp[10] != "") {
        $sourcefile[$i] = array('Business_Name' => $tmp[1], 'Address' => $tmp[3], 'City' => $tmp[4], 'State' => $tmp[5], 'Zip' => $tmp[6], 'Phone' => $tmp[7], 'Email' => $tmp[10]);
        foreach($sourcefile[$i] as $key => $value);
        fwrite($fileNew, $value);
    }
}
?>
From a quick glance:
foreach($sourcefile[$i] as $key => $value);
fwrite($fileNew, $value);
should be
foreach($sourcefile[$i] as $key => $value){
    fwrite($fileNew, $value);
}
Also, you have
$newfile = $newfile.Date("Ymd").".csw";
rather than what I assume should be
$newfile = $newfile.Date("Ymd").".csv";
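For what it's worth, putting those fixes together, a minimal sketch of the corrected loop could look like this; it swaps explode()/fwrite() for str_getcsv()/fputcsv() so the quoting is handled for you, and assumes the same column positions (index 10 = email) as your original code:
$file = 'uploads/AK_Accountants.csv';
$fileNew = fopen('uploads/AK_' . date("Ymd") . '.csv', 'w+');
foreach (file($file) as $line) {
    $tmp = str_getcsv($line);           // parse the quoted fields properly
    if (!empty($tmp[10])) {             // keep only records that have an email address
        fputcsv($fileNew, array($tmp[1], $tmp[3], $tmp[4], $tmp[5], $tmp[6], $tmp[7], $tmp[10]));
    }
}
fclose($fileNew);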
Your last foreach statement is terminated by a ';' and has no code block. Once the foreach statement has finished iterating, you'll get only the last value written to the file, i.e. just the email address.
You currently have
foreach ( ... );
fwrite( ... );
but you probably mean
foreach ( ... ) {
    fwrite( ... );
}
Been there, done that :)
HTH
I'm working with a pair of PHP scripts. One script reads data from a MySQL database and exports it to a CSV file, then a second script imports that CSV file into another MySQL database instance. The structure of the database tables A (export) and B (import) is identical.
These scripts work fine for "normal" MySQL tables and column types. However, the import fails when we apply them to a MySQL table that stores a JSON object in one of the columns (MySQL column type is "json").
The script that exports the data works as expected, producing a CSV file with the JSON object surrounded by double quotes...just like the other values in the row.
The row in the exported CSV file looks like this (the last item is a complex json object, abbreviated for simplicity):
"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"
In the PHP script to export the data it's essentially this:
$rowStr = '"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"';
file_put_contents($filepath, trim($rowStr), FILE_APPEND);
No issues with the export. Row appears in the CSV file as expected (same format as above).
My code to read the csv into the other database looks like this:
$allRows = array_map('str_getcsv',file($fp)); // read the exported csv file where $fp is the path to the file
foreach($allRows as $i => $row) {
    //get the col_names from the 2nd database table (identical to the first) where $ac-> is the class that handles data queries
    $col_names = $ac->get_table_column_names('databasename',$tablename);
    $update_arr = array();
    foreach($col_names as $i => $cname) {
        $update_arr[$cname['COLUMN_NAME']] = $val;
    }
    //and write the row to the 2nd db's table
    $ac->create_update_table($update_arr,$tablename,FALSE);
}
And, if it matters, here are the Queries used in the "get_table_column_names" and "create_update_table" functions:
get_table_column_names //Using PDO
SELECT COLUMN_NAME,COLUMN_DEFAULT,DATA_TYPE FROM information_schema.columns WHERE table_schema = :db AND table_name = :table
create_update_table
INSERT INTO 'tablename' (field1, field2, field3, field4,json_object_column) VALUES ("894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}")
The problem is that, when importing, the row is converted to an array like this:
array (
    [0] => "894",
    [1] => "Somebody",
    [2] => "Related",
    [3] => "2020-02-20",
    [4] => "{name1":"value1",
    [5] => "name2:"value2",   //should be part of node 4
    [6] => "name3:"value3"}", //should be part of node 4
    [7] => "expired"
);
What's happening is that the "," inside the JSON object is being treated as a field separator, so the JSON is broken up into separate array nodes. Other than writing a script to detect fields that start with "{" and end with "}", how can I read the entire JSON string as one field (as it is in the database)? Or, perhaps, is there a better way to output the string so that it can be read as one item?
If instead of just writing out the data using something like file_put_contents() you use some of the methods designed for CSV files, this will do most of the work for you...
To write the data use fputcsv(), which escapes the enclosure character for you (in this case each embedded " becomes "")...
$row = ["894","Somebody","Related","2020-02-20",'{"name1":"value1","name2":"value2","name3":"value3"}',"expired"];
$fh = fopen($filepath, "a");
fputcsv($fh, $row);
fclose($fh);
which will write to the file
894,Somebody,Related,2020-02-20,"{""name1"":""value1"",""name2"":""value2"",""name3"":""value3""}",expired
and then to read from the file, just read a line at a time and use fgetcsv()...
$fh = fopen($filepath, "r");
print_r(fgetcsv($fh)); // This in a loop to read all lines
fclose($fh);
which shows
Array
(
[0] => 894
[1] => Somebody
[2] => Related
[3] => 2020-02-20
[4] => {"name1":"value1","name2":"value2","name3":"value3"}
[5] => expired
)
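And if you want to confirm the JSON column survives the round trip, decoding it as you read it back is a quick check; a small sketch, assuming the row layout above where index 4 holds the JSON:
$fh = fopen($filepath, "r");
while (($row = fgetcsv($fh)) !== false) {
    $json = json_decode($row[4], true);   // index 4 = the JSON column in this layout
    if ($json === null) {
        // decoding failed; json_last_error_msg() explains why
    }
}
fclose($fh);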
One way of solving this is to create a new copy of the array, manipulate the new array, and add the JSON as a sliced part of the original array.
$allRows = array_map('str_getcsv',file($fp));
$new_arr = [];
foreach($allRows[0] as $key=>$item) {
    $json = false;
    if (substr($item,0,1) == '{') {
        $json_start = $key;
        $json = true;
    }
    if (substr($item,-2,2) == '}"') {
        $json_stop = $key;
        $json = true;
        //Slice json-part from original array (in your case 4,5,6)
        $sl = array_slice($allRows[0], $json_start, ($json_stop-$json_start)+1);
        //Add the sliced part into the key where the json started, restoring the
        //commas that str_getcsv consumed as delimiters
        $new_arr[$json_start] = implode(',',$sl);
    }
    if ($json === false) $new_arr[] = $item;
}
And then you have your expected array in $new_arr.
I have a huge text file and I am trying to read it and insert it line by line.
This is the txt file data:
'REG','KOIL','Kohinoor Industries Ltd.','READY',4.82,2.82,3.82
'REG','EPQL','Engro Powergen Qadirpur Ltd.','READY',36.9495,33.4305,35.19
Function for inserting the data:
$file_path = FCPATH.'uploads/text/'.$file_name;
$psx_date = $this->input->post('file_date');
$open = fopen($file_path,"r");
$i = 1;
while(!feof($open)){
    $line = fgets($open);
    if($i > 2){
        $values = explode(",",$line);
        $psx_symbol = str_replace('\'',null,$values[1]);
        $no_of_rows = read_psx_where($psx_symbol,$psx_date);
        if($no_of_rows <= 0){
            $psx_data = array(
                'PSX_SYMBOL'   => $psx_symbol,
                'PSX_DATE'     => $psx_date,
                'PSX_HIGH'     => $values[4],
                'PSX_LOW'      => $values[5],
                'PSX_CLOSE'    => $values[6],
                'PSX_DATETIME' => date('Y-m-d H:i:s'),
                'PSX_SATUS'    => 1
            );
            insert_psx_data($psx_data);
        }
    }
    $i++;
}
fclose($open);
I first skip the first two lines of the text file, and then I check if the same symbol already exists, and if so skip that line.
This method is working, but it is far too slow and exceeds the max execution time.
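One common way to speed this up is to stop doing one lookup and one INSERT per line: collect the parsed rows first, check the existing symbols with a single query, and insert the remainder in one batch. A rough sketch of the idea, where read_psx_symbols_for_date() and insert_psx_data_batch() are hypothetical helpers standing in for a single SELECT of existing symbols and a multi-row INSERT (or CodeIgniter's insert_batch()):
$open  = fopen($file_path, "r");
$i     = 1;
$batch = array();
while (($line = fgets($open)) !== false) {
    if ($i > 2) {                                  // still skipping the first two lines
        $values     = explode(",", $line);
        $psx_symbol = str_replace("'", '', $values[1]);
        $batch[$psx_symbol] = array(               // keyed by symbol, so duplicates in the file collapse
            'PSX_SYMBOL'   => $psx_symbol,
            'PSX_DATE'     => $psx_date,
            'PSX_HIGH'     => $values[4],
            'PSX_LOW'      => $values[5],
            'PSX_CLOSE'    => $values[6],
            'PSX_DATETIME' => date('Y-m-d H:i:s'),
            'PSX_SATUS'    => 1
        );
    }
    $i++;
}
fclose($open);
// One query for the symbols already stored for this date, then one batch insert for the rest.
$existing = read_psx_symbols_for_date($psx_date);        // hypothetical helper
$new_rows = array_diff_key($batch, array_flip($existing));
insert_psx_data_batch(array_values($new_rows));          // hypothetical helper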
From the wp-config.php file I need to get the DB name, username and password values without including the wp-config file, and assign them to three separate variables for further use.
define('DB_NAME', 'somedb');
define('DB_USER', 'someuser');
define('DB_PASSWORD', 'somepass');
My script will be in the same folder. No, I don't want to use any WordPress functions.
If you really don't want to include the file, as mentioned in the comments already, we can read the file contents into an array with file(). Then iterate over each line and apply some cleanup, until we get to a format we can work with:
<?php
$config = file('wp-config.php');
$dbLines = [];
foreach($config as $line){
    if (stristr($line,"define('DB_")!==FALSE){
        $dbLines[] = $line;
    }
}
array_walk($dbLines, 'cleanup');
// apply the cleanup() function to all members of the array, basically to each line
function cleanup(&$value){
    // replace the leading 'define(' and trailing closing bracket
    $value = str_replace(
        array("define(",");"),'',$value
    );
    $value = preg_replace('#\s+//(.*)#','',$value); // remove the comments
}
// at this point we have a csv structure with a single quote as the enclosure
// and comma+space as the separator
$dbConfig = [];
foreach ($dbLines as $dbline){
    // read the line into separate variables and build an array
    list($key,$value) = str_getcsv($dbline,', ',"'");
    $dbConfig[$key] = $value;
}
print_r($dbConfig);
This will output
Array
(
[DB_NAME] => putyourdbnamehere
[DB_USER] => usernamehere
[DB_PASSWORD] => yourpasswordhere
[DB_HOST] => localhost
[DB_CHARSET] => utf8
[DB_COLLATE] =>
)
If you want to access a single element from the array, just
print $dbConfig['DB_HOST'];
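And since the goal was three separate variables for further use, they can be pulled straight out of that array (the variable names here are just examples):
$dbName = $dbConfig['DB_NAME'];
$dbUser = $dbConfig['DB_USER'];
$dbPass = $dbConfig['DB_PASSWORD'];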
I have an issue with a processing script. I would like to allow a maximum of 2 duplicate IP addresses in a CSV file, to prevent some spamming and to take into consideration that the user could make a mistake when filling in the form. I can't seem to reference the $ip variable correctly in the script, or there might be something I am missing altogether. Here is the code snippet thus far:
<?php
#VARIABLE DECLARATIONS (filename and post vars) GO HERE
$counter = 0;
if (file_exists($filename))
{
$file = fopen($filename, "a");
while($data = fgetcsv($filename)){
if(isset($data[$ip])){
$counter++;
continue;
if((isset($data[$ip])){
$counter++;
if($counter == 2){
echo "";
}
}
}
}
##file write goes here
}
?>
Any help on this would be appreciated,
Jim
You will have to read all the elements into an array first, and only after you have the number of occurrences of each IP address ready should you go ahead with writing the file (to a separate file, maybe?).
You can first prepare an array with IP indexes and all the rows corresponding to an IP as value attributes to that key.
This could be done -
$csvArray = array_map('str_getcsv', file('myCSVFile.csv'));
foreach($csvArray as $eachRow)
{
    //prepare IP indexed array with every detail row as an attribute of the corresponding IP key.
    $properlyFormattedArray[$eachRow[5]][] = $eachRow;
}
You get an array like this -
Array(
    ['92.27.21.171'] => Array(
        [0] => Array("Mr","Test","davis","07972889989","01159174767","92.27.21.171"),
        [1] => Array("Mr","bob","jones","07998998008","01159174767","92.27.21.171"),
        ...
    ),
    ['92.27.21.172'] => ...
)
Once you have this array, just loop over it and write at most 2 rows for every IP.
foreach($properlyFormattedArray as $ip => $details)
{
    $count = 0;
    foreach($details as $eachDetail)
    {
        if($count < 2)
        {
            //write $eachDetail into file
            $count++;
        }
    }
}
But, in this case, the order of data (compared with your input file) will be changed, and the duplicates will be written in consecutive rows in the csv file (not sure if you would be okay with it).
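If keeping the original row order matters, a variation on the same idea is to make one pass over the rows in their original order and skip an IP once two of its rows have been written; a rough sketch, assuming as above that the IP sits in column 5, and writing to a separate output file (the filename is just an example):
$rows = array_map('str_getcsv', file('myCSVFile.csv'));
$counts = array();                        // rows written so far, per IP
$out = fopen('myCSVFile-filtered.csv', 'w');
foreach ($rows as $row) {
    $ip = $row[5];                        // column 5 assumed to hold the IP, as above
    $counts[$ip] = isset($counts[$ip]) ? $counts[$ip] + 1 : 1;
    if ($counts[$ip] <= 2) {              // allow at most two rows per IP
        fputcsv($out, $row);
    }
}
fclose($out);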
Still struggling with PHP and CSV file manipulation. I will try to ask this properly so I can get some help.
I have a CSV file with about 4000 lines/rows, and I wrote this code to create an array of the entire CSV and pull the LAST line of the CSV file out to use in my script. The code works to do all this with success.
// CREATE ASSOCIATIVE ARRAY FROM LAST ROW OF CSV FILE
$csv = array();
if (FALSE !== $handle = fopen("Alabama-TEST.csv", "r"))
{
    while (FALSE !== $row = fgetcsv($handle))
    {
        $csv[] = $row;
    }
}
$new_csv = array();
foreach ($csv as $row)
{
    $new_row = array();
    for ($i = 0, $n = count($csv[0]); $i < $n; ++$i)
    {
        $new_row[$csv[0][$i]] = $row[$i];
    }
    $new_csv[] = $new_row;
}
The variable $new_row ends up holding the last row in the CSV and I am able to use the data fine. But when the script is finished running, I want to delete this last row (the one in $new_row) from the file.
Here is an example of the top of my CSV file:
CITY ADDRESS STATE NAME
Birmingham 123 Park St. Alabama Franky
So I just want to remove the last row and keep the header at the top, for the next time the script runs. I've been trying for 3 days solid to figure this out, so I'm hoping some kind person who KNOWS WHAT THEY ARE DOING can help.
Here is the code I have tried to remove the last row:
$inp = file('Alabama-TEST.csv');
$out = fopen('Alabama-TEST.csv','w');
for ($i = 0; $i < count($inp) - 1; $i++)
    fwrite($out, $inp[$i]);
fclose($out);
Since I'm using a large file at around 4000 rows, is there a way to do this without using too much memory?
You need to use array_pop on your array ($new_csv) and fputcsv to save the file:
array_pop($new_csv); // Removes the last element in the array
$out = fopen('Alabama-TEST-new.csv','w');
foreach ($new_csv as $row) { // sorry for writing it badly, try now
    fputcsv($out, $row);
}
fclose($out);
I don't have an environment to test this, sorry.
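On the memory concern from the question: rewriting the whole file isn't strictly necessary. A lower-memory sketch is to find the byte offset where the last line begins and truncate the file there (this assumes the file still has at least one data row under the header):
$fh = fopen('Alabama-TEST.csv', 'r+');
$lineStart = 0;                  // offset where the line just read begins
$nextStart = 0;
while (fgets($fh) !== false) {
    $lineStart = $nextStart;     // the line we just read started here
    $nextStart = ftell($fh);     // the next line (if any) starts here
}
ftruncate($fh, $lineStart);      // cut the file off at the start of the last line
fclose($fh);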