How can I prepare a CSV file so that I can run queries against it? Do you have any ideas, or maybe links, that could be helpful?
I have a MySQL database and I run a certain SQL query on it, let's say "select * from some_database where id=1", and that part is all clear and fine. But now I export a CSV from the database and I want to open and read it with PHP, and filter the data the same way the MySQL query does. Can someone help me with some ideas and point me in the right direction?
<?php
$csv = <<<CSV
23, Foo, Bar, foo#example.com
47, Baz, Qux, baz#example.com
CSV;

$items = explode("\n", $csv);
$items = array_map('str_getcsv', $items);

$get_by_id = function ($id) use ($items) {
    foreach ($items as $item) {
        if ($item[0] == $id) {
            return $item;
        }
    }
};

var_export($get_by_id('47'));
Output:
array (
  0 => '47',
  1 => ' Baz',
  2 => ' Qux',
  3 => ' baz#example.com',
)
Related
I'm working with a pair of PHP scripts. One script reads data from a MySQL database and exports it to a CSV file; a second script then imports that CSV file into another MySQL database instance. The structures of the database tables A (export) and B (import) are identical.
These scripts work fine for "normal" MySQL tables and column types. However, the import fails when we apply them to a MySQL table that stores a JSON object in one of the columns (the MySQL column type is "json").
The script that exports the data works as expected, producing a CSV file with the JSON object surrounded by double quotes... just like the other values in the row.
The row in the exported CSV file looks like this (the last item is a complex json object, abbreviated for simplicity):
"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"
In the PHP script that exports the data, it's essentially this:
$rowStr = '"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"';
file_put_contents($filepath, trim($rowStr), FILE_APPEND);
No issues with the export. Row appears in the CSV file as expected (same format as above).
My code to read the csv into the other database looks like this:
$allRows = array_map('str_getcsv', file($fp)); // read the exported csv file where $fp is the path to the file
// get the col_names from the 2nd database table (identical to the first) where $ac-> is the class that handles data queries
$col_names = $ac->get_table_column_names('databasename', $tablename);
foreach ($allRows as $row) {
    $update_arr = array();
    foreach ($col_names as $j => $cname) {
        $update_arr[$cname['COLUMN_NAME']] = $row[$j];
    }
    // and write the row to the 2nd db's table
    $ac->create_update_table($update_arr, $tablename, FALSE);
}
And, if it matters, here are the queries used in the "get_table_column_names" and "create_update_table" functions:
get_table_column_names //Using PDO
SELECT COLUMN_NAME,COLUMN_DEFAULT,DATA_TYPE FROM information_schema.columns WHERE table_schema = :db AND table_name = :table
create_update_table
INSERT INTO 'tablename' (field1, field2, field3, field4,json_object_column) VALUES ("894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}")
The problem is that, when importing, the row is converted to an array like this:
array (
  [0] => "894",
  [1] => "Somebody",
  [2] => "Related",
  [3] => "2020-02-20",
  [4] => "{name1":"value1",
  [5] => "name2:"value2",   // should be part of node 4
  [6] => "name3:"value3"}", // should be part of node 4
  [7] => "expired"
);
What's happening is that the "," inside the JSON object is being treated as a field separator, so the JSON is broken up across array elements. Other than writing a script to detect fields that start with "{ and end with }", how can I read the entire JSON string as one field (as it is in the database)? Or, perhaps, is there a better way to output the string so that it can be read back as one item?
If instead of just writing out the data with something like file_put_contents() you use the functions designed for CSV files, they will do most of the work for you...
To write the data, use fputcsv(), which escapes the enclosure character (in this case each " becomes "")...
$row = ["894","Somebody","Related","2020-02-20",'{"name1":"value1","name2":"value2","name3":"value3"}',"expired"];
$fh = fopen($filepath, "a");
fputcsv($fh, $row);
fclose($fh);
which will write to the file
894,Somebody,Related,2020-02-20,"{""name1"":""value1"",""name2"":""value2"",""name3"":""value3""}",expired
and then to read the file back, call fgetcsv() once per line...
$fh = fopen($filepath, "r");
print_r(fgetcsv($fh)); // This in a loop to read all lines
fclose($fh);
which shows
Array
(
    [0] => 894
    [1] => Somebody
    [2] => Related
    [3] => 2020-02-20
    [4] => {"name1":"value1","name2":"value2","name3":"value3"}
    [5] => expired
)
One way of solving this is to create a new copy of the array, manipulate the new array, and add the JSON as a sliced part of the original array.
$allRows = array_map('str_getcsv', file($fp));
$new_arr = [];
$inJson = false;
foreach ($allRows[0] as $key => $item) {
    if (!$inJson && substr($item, 0, 1) == '{') {
        $json_start = $key; // json starts at this key
        $inJson = true;
    }
    if ($inJson) {
        if (substr($item, -2, 2) == '}"' || substr($item, -1, 1) == '}') {
            // Slice the json part from the original array (in your case 4, 5, 6)
            $sl = array_slice($allRows[0], $json_start, ($key - $json_start) + 1);
            // Re-join with the commas that str_getcsv split on
            $new_arr[] = implode(',', $sl);
            $inJson = false;
        }
        continue;
    }
    $new_arr[] = $item;
}
And then you have your expected array in $new_arr.
Background
I'm trying to complete a code challenge where I need to refactor a simple PHP application that accepts a JSON file of people, sorts them by registration date, and outputs them to a CSV file. The provided program already functions and works fine with a small input, but intentionally fails with a large input. To complete the challenge, the program should be modified so it can parse and sort a 100,000-record, 90 MB file without running out of memory, as it currently does.
In its current state, the program uses file_get_contents(), followed by json_decode(), and then usort() to sort the items. This works fine with the small sample data file, but not with the large sample data file: it runs out of memory.
The input file
The file is in JSON format and contains 100,000 objects. Each object has a registered attribute (example value 2017-12-25 04:55:33) and this is how the records in the CSV file should be sorted, in ascending order.
My attempted solution
Currently, I've used the halaxa/json-machine package, and I'm able to iterate over each object in the file. For example
$people = \JsonMachine\JsonMachine::fromFile($fileName);
foreach ($people as $person) {
// do something
}
Reading the whole file into memory as a PHP array is not an option, as it takes up too much memory, so the only solution I've been able to come up with so far has been iterating over each object in the file, finding the person with the earliest registration date and printing that. Then, iterating over the whole file again, finding the next person with the earliest registration date and printing that etc.
The big issue with that is the nested loops: a loop that runs 100,000 times containing another loop that runs 100,000 times. It's not a viable solution, and that's the furthest I've made it.
How can I parse, sort, and print to CSV, a JSON file with 100,000 records? Usage of packages / services is allowed.
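For reference, the classic no-database approach here is an external merge sort: stream the file once, sort fixed-size chunks in memory, write each sorted chunk to disk (say, one JSON record per line), then stream-merge the chunk files. The merge phase could look roughly like this; a sketch only, and the chunk-file format and the registered field are assumptions based on the question:

```php
<?php
// Sketch: k-way merge of already-sorted chunk files (one JSON record per
// line, each with a 'registered' key). Only k records are in memory at
// any time, so this never loads the whole dataset.
function merge_sorted_chunks(array $chunkPaths): Generator
{
    $heap = new SplMinHeap(); // orders entries by [registered, chunk index]
    $handles = [];
    foreach ($chunkPaths as $i => $path) {
        $handles[$i] = fopen($path, 'r');
        if (($line = fgets($handles[$i])) !== false) {
            $rec = json_decode($line, true);
            $heap->insert([$rec['registered'], $i, $rec]);
        }
    }
    while (!$heap->isEmpty()) {
        [, $i, $rec] = $heap->extract(); // globally earliest record
        yield $rec;
        if (($line = fgets($handles[$i])) !== false) { // refill from same chunk
            $next = json_decode($line, true);
            $heap->insert([$next['registered'], $i, $next]);
        }
    }
    foreach ($handles as $h) {
        fclose($h);
    }
}
```

Each yielded record can then be written straight out with fputcsv().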
I ended up importing into MongoDB in chunks and then retrieving the records in the correct order to print them.
Example import:
$this->collection = (new Client($uri))->collection->people;
$this->collection->drop();

$people = JsonMachine::fromFile($fileName);

$chunk = [];
$chunkSize = 5000;
$personNumber = 0;

foreach ($people as $person) {
    $personNumber += 1;
    $chunk[] = $person;
    if ($personNumber % $chunkSize == 0) { // Chunk is full
        $this->collection->insertMany($chunk);
        $chunk = [];
    }
}

// The very last chunk was not filled to the max, but we still need to import it
if (count($chunk)) {
    $this->collection->insertMany($chunk);
}

// Create an index for quicker sorting
$this->collection->createIndex(['registered' => 1]);
Example retrieve:
$results = $this->collection->find(
    [],
    ['sort' => ['registered' => 1]]
);

// For every person...
foreach ($results as $person) {
    // For every attribute...
    foreach ($person as $key => $value) {
        if ($key != '_id') { // No need to include the new MongoDB ID
            echo some_csv_encode_function($value) . ',';
        }
    }
    echo PHP_EOL;
}
I have been assigned the following PHP task for Uni:
1) Export the Trades/Crafts table to Excel
2) Make three columns in Excel: Trade-ID, Category-ID, Trade-Name and organize it
3) Export the table as a .CSV file (easier for PHP manipulation)
4) Write a PHP script that creates a YAML file from CSV file that corresponds to the structure of AppBundle / Resources / fixtures / prod / trades.yml
How you do that exactly, is up to you. You are free to use whatever method you prefer, but the YAML file has to have
the right structure.
This is the trades.yml file (it was given as an example for us to follow):
-
ref: trade-1
id: 1
name: Plumber
category: $trade-category-1
-
ref: trade-16
id: 16
name: Electronic Engineer
category: $trade-category-2
So as you might have guessed, I have a table with those 3 columns (Trade-ID, Category-ID and Trade-Name). The table contains about 150 rows with the name of many different kind of jobs, the type/branch of the job as category ID, and the ID of the job itself.
I exported that Excel table as a .csv file, as instructed. Now, I think I should use a function like str_getcsv or fgetcsv, which, as I understand it, read the data in the CSV table and convert it into PHP arrays. After that, I need to convert those arrays into the YAML syntax format, but I read that's not particularly difficult.
Anyway, the number of 'titles/entries' in the YAML structure (ref, id, name, category) does not equal the number of columns in the CSV table (trade name, cat id, trade id), so I don't know how or where I should even start!
Also, for the "ref" part, I guess I would have to make it look like this: trade-<ID of trade>, but how can I do something like "trade-" . $TradeID inside an array? How can I declare $TradeID = <ID of the job>? Can I refer to the row and column of the CSV table I want, like SELECT in SQL? Or should I maybe use a WHILE loop that fetches all of the table's rows?
I have tried it like this:
<?php
$file = fopen('MyTable.csv', 'r') or die('error');
while (($line = fgetcsv($file)) !== FALSE) {
    // $line is an array of the csv elements
    // print_r($line);
    foreach ($line as $key => $value) {
        # code...
    }
}
fclose($file);
But it outputs something like this:
Array ( [0] => 1;1;Bituminiser; ) Array ( [0] => 1;2;Construction Dryer; ) Array ( [0] => 1;3;Concrete Driller and Cutter; ) Array ( [0] => 1;4;Concrete Block and Terrazzo Maker; ) Array ( [0] => 1;5;Well Builder; )
And so on... But I still have the other problem, plus the [0] inside each array.
I really can't figure it out. How can I convert the CSV table into a YAML file with the structure of trades.yml using PHP?
Try this:
fgetcsv($file, 0, ';')
And look for more information in the PHP docs for fgetcsv.
However, I've made a sample script for you:
<?php
$file = fopen('MyTable.csv', 'r') or die('error');
$yml = '';
$indent = str_repeat(' ', 4);
// Column order assumed from your sample output: category-id;trade-id;trade-name
while (($values = fgetcsv($file, 0, ';')) !== FALSE) {
    [$categoryId, $tradeId, $name] = $values;
    $arr = [
        'ref'      => "trade-{$tradeId}",
        'id'       => $tradeId,
        'name'     => $name,
        'category' => "\$trade-category-{$categoryId}",
    ];
    $yml .= "-\n";
    foreach ($arr as $key => $value) {
        $yml .= "{$indent}{$key}: {$value}\n";
    }
}
fclose($file);
echo $yml;
I have a csv file I need to cleanup. It contains 13 fields, but I only need 7 (Business, Address, City, St, Zip, Phone, Email)
I need to run through all of the records and create a new output of just the records with email addresses.
In a nutshell... I load the original file, run the for loop, explode the results, then look for the records where the $tmp[10] index is not null. I then get the rest of the required fields, do a foreach loop, and fwrite the results to a new csv file.
Depending on how I tweak the code, I get either...
A text file of just email addresses.
or
A text file of just the last record with an email address.
I have been working on this too long and I just need a fresh set of eyes to point out the problem. I am new to PHP and want to make this work. Thanks in advance.
<?php
// See if file exists and is readable
$file = 'uploads/AK_Accountants.csv';
$newfile = basename($file, ".csv");
$newfile = $newfile . Date("Ymd") . ".csw";
$fileNew = fopen('uploads/AK_' . Date("Ymd") . '.csv', 'w+');
// Read the file into an array called $sourcefile
$sourcefile = file($file);
// Loop through the array and process each line
for ($i = 0; $i < count($sourcefile); $i++) {
    // Separate each element and store in a temp array
    $tmp = explode('","', $sourcefile[$i]);
    // Assign each element of the temp array to a named array key
    if ($tmp[10] != "") {
        $sourcefile[$i] = array('Business_Name' => $tmp[1], 'Address' => $tmp[3], 'City' => $tmp[4], 'State' => $tmp[5], 'Zip' => $tmp[6], 'Phone' => $tmp[7], 'Email' => $tmp[10]);
        foreach ($sourcefile[$i] as $key => $value);
        fwrite($fileNew, $value);
    }
}
?>
From a quick glance:
foreach($sourcefile[$i] as $key => $value);
fwrite($fileNew, $value);
should be
foreach ($sourcefile[$i] as $key => $value) {
    fwrite($fileNew, $value);
}
Also, you have
$newfile = $newfile.Date("Ymd").".csw";
rather than what I assume should be
$newfile = $newfile.Date("Ymd").".csv";
Your last foreach statement is terminated by a ';' and has no code block. Once the foreach has finished iterating, only the last value gets written to the file, i.e. just the email address.
You currently have
foreach (...);
fwrite(...);
but you probably mean
foreach (...) {
    fwrite(...);
}
Been there, done that :)
HTH
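For completeness, here is a sketch of what the whole filter might look like using the built-in CSV functions, which also avoids the fragile explode('","', ...) split; the column indexes (1, 3-7, 10) are taken from the question, and the function name is my own:

```php
<?php
// Sketch: copy only the rows that have an email address, keeping the
// seven wanted columns. fgetcsv()/fputcsv() handle all the quoting.
function filter_with_email(string $in, string $out): int
{
    $src = fopen($in, 'r');
    $dst = fopen($out, 'w');
    $written = 0;
    while (($row = fgetcsv($src)) !== false) {
        if (!empty($row[10])) { // keep only rows with an email
            fputcsv($dst, [$row[1], $row[3], $row[4], $row[5], $row[6], $row[7], $row[10]]);
            $written++;
        }
    }
    fclose($src);
    fclose($dst);
    return $written; // number of rows kept
}
```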
I'm using PHP to create a playlist. Two random songs are chosen from a directory, and their name and location are stored in an array and then written to a file via json_encode().
$arraySongs[] = array('name' => $songName , 'mp3' => $webUrl);
This works great. I can make a very long playlist, two songs at a time. I'd also like to remove songs, so I have an AJAX-powered delete button that posts the id of the track to be deleted. PHP then loads the whole tracklist...
$decoded = json_decode(file_get_contents($tracklist),true);
and removes the given song from the array, then re-encodes and re-writes the JSON text file. This all works great.
The problem comes whenever I try to delete anything with a playlist of more than 10 items.
Typically, my song.json file goes [{name:song1,mp3:song url},{name:song2,mp3:song2 url}]
However, when I have a list of more than 10 items, the re encoded playlist looks like this:
[{ ... },{name:song9,mp3:song9 url}],[10,{"name":song10,mp3:song10 url}]
Why does my re-encoded array get that strange [10,{"name"... [11,{"name"... [12,{"name"...
while everything below 10 is always fine?
Thanks for reading this! Any suggestions would be greatly appreciated, this is driving me nuts!
Here is the code I'm using:
<?php
$json = "song.php";
$decoded = json_decode(file_get_contents($json), true);
$playlist = array();
$names = array();

// Now get i from Javascript
$i = $_POST['id'];

// Select i's array
$collect_array = $decoded[$i];
while (list($key, $val) = each($collect_array)) {
    // Remove i's values
    // echo "<br />$key -> $val <br>";
    unset($decoded[$i]);
}

// Take all the remaining arrays
$collect_array = $decoded;
while (list($key, $val) = each($collect_array)) {
    $arraySongs[] = array($key, $val);
}

// Our new array, ready for json.
$jsonData = json_encode($arraySongs);

// Open song.php and scribble it down
$tracklist = $json;
$fh = fopen($tracklist, 'w') or die("can't open filename: $tracklist");
fwrite($fh, $jsonData);
fclose($fh);
?>
Try removing elements with unset().
Debug your code (not posted in the thread, so do it yourself) by adding a line where you var_dump() or print_r() the whole thing before json_encode().
Or it's a bug in json_encode(), which would not be nice...
Encode the track ID on 2 or even 3 digits using the PHP function sprintf() with the format specifier %02d.
This worked fine for me.
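For anyone landing here later: the wrapping likely comes from rebuilding the list as array($key, $val) pairs, and json_encode() only emits a JSON array when the keys run 0..n-1 with no gaps. A minimal sketch of the delete step (the helper name is my own), reindexing with array_values() after unset():

```php
<?php
// Sketch: drop one track and reindex the keys to 0..n-1, so
// json_encode() produces a clean JSON array rather than wrapped
// [index, {...}] pairs or a JSON object.
function remove_track(array $tracks, int $i): array
{
    unset($tracks[$i]);           // remove the selected song
    return array_values($tracks); // reindex keys to 0..n-1
}
```

Used as: file_put_contents($json, json_encode(remove_track($decoded, (int) $_POST['id'])));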