PHP file_put_contents with multiple files

I am using file_put_contents to create and add info to a json file. This works successfully, however I need to create two files with different names (title.json and dates.json) - is this possible?
The reason I need to do this is because I am using twitter typeahead and it seems to only work with separate json files.
It works with a single file, i.e.:
file_put_contents(URL . '/title.json', json_encode($data));
However, not with this:
file_put_contents(URL . '/title.json', '/dates.json',
json_encode($data));
I receive the following error message:
Warning: file_put_contents() expects parameter 3 to be long, string
given in C:\xampp\htdocs... on line 23
$sql = "SELECT DISTINCT pub_id, title, place_name, party_name, publication_date FROM vw_ft_search";
$result = $mysqli->query($sql); // assuming a mysqli connection named $mysqli
$data = array();
while ($row = $result->fetch_assoc()) {
    $data[] = array(
        'title'            => utf8_encode($row['title']),
        'pub_id'           => utf8_encode($row['pub_id']),
        'place_name'       => utf8_encode($row['place_name']),
        'party_name'       => utf8_decode($row['party_name']),
        'publication_date' => $row['publication_date']
    );
}
file_put_contents(URL . '/title.json','/dates.json', json_encode($data)); //line 23
I am probably missing something very easy; any advice is appreciated.

file_put_contents() accepts only one filename; its third parameter is a flags bitmask, not a second file, which is why you get the "expects parameter 3 to be long" warning. Use a loop to write all the files ->
$files = array('/title.json', '/dates.json');
then iterate through $files:
foreach ($files as $file) {
    file_put_contents(URL . $file, json_encode($data));
}
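Since the asker's two files presumably hold different data (titles vs. dates), a variation is to map each filename to its own payload. A minimal sketch, assuming hypothetical field values and using a temp directory in place of the question's URL constant:

```php
<?php
// Hypothetical split: the field values and base path are illustrative,
// not taken from the question's actual schema.
$base = sys_get_temp_dir();

$data = [
    ['title' => 'First title',  'publication_date' => '1901-05-04'],
    ['title' => 'Second title', 'publication_date' => '1902-07-12'],
];

// Map each output file to the slice of data it should contain
$byFile = [
    '/title.json' => array_column($data, 'title'),
    '/dates.json' => array_column($data, 'publication_date'),
];

foreach ($byFile as $file => $payload) {
    file_put_contents($base . $file, json_encode($payload));
}

echo file_get_contents($base . '/title.json');
// ["First title","Second title"]
```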

Related

Export/import mysql json object via CSV

I'm working with a pair of PHP scripts. One script reads data from a MySQL database and exports it to a CSV file; a second script then uploads the exported CSV file to another MySQL database instance. The structure of the database tables A (export) and B (import) are identical.
These scripts work fine for "normal" MySQL tables and column types. However, the import fails when we apply them to a MySQL table that stores a JSON object in one of the columns (MySQL column type is "json").
The script that exports the data works as expected, producing a CSV file with the JSON object surrounded by double quotes...just like the other values in the row.
The row in the exported CSV file looks like this (the last item is a complex json object, abbreviated for simplicity):
"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"
In the PHP script to export the data it's essentially this:
$rowStr = '"894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}","expired"';
file_put_contents($filepath, trim($rowStr), FILE_APPEND);
No issues with the export. Row appears in the CSV file as expected (same format as above).
My code to read the csv into the other database looks like this:
$allRows = array_map('str_getcsv', file($fp)); // read the exported csv file where $fp is the path to the file
foreach ($allRows as $i => $row) {
    // get the col_names from the 2nd database table (identical to the first) where $ac-> is the class that handles data queries
    $col_names = $ac->get_table_column_names('databasename', $tablename);
    $update_arr = array();
    foreach ($col_names as $i => $cname) {
        $update_arr[$cname['COLUMN_NAME']] = $val;
    }
    // and write the row to the 2nd db's table
    $ac->create_update_table($update_arr, $tablename, FALSE);
}
And, if it matters, here are the Queries used in the "get_table_column_names" and "create_update_table" functions:
get_table_column_names //Using PDO
SELECT COLUMN_NAME,COLUMN_DEFAULT,DATA_TYPE FROM information_schema.columns WHERE table_schema = :db AND table_name = :table
create_update_table
INSERT INTO 'tablename' (field1, field2, field3, field4,json_object_column) VALUES ("894","Somebody","Related","2020-02-20","{"name1":"value1","name2":"value2","name3":"value3"}")
The problem is that, when importing, the row is converted to an array like this:
array (
    [0] => "894",
    [1] => "Somebody",
    [2] => "Related",
    [3] => "2020-02-20",
    [4] => "{name1":"value1",
    [5] => "name2:"value2",   // should be part of node 4
    [6] => "name3:"value3"}", // should be part of node 4
    [7] => "expired"
);
What's happening is that the "," inside the JSON object is being treated as a field separator, so the JSON is broken up into array nodes. Other than writing a script to detect fields that start with "{" and end with "}", how can I read the entire JSON string as one field (as it is in the database)? Or, perhaps, is there a better way to output the string so that it can be read as one item?
If instead of just writing out the data using something like file_put_contents() you use some of the methods designed for CSV files, this will do most of the work for you...
To write the data use fputcsv(), which escapes the enclosure character (each " inside a field becomes "")...
$row = ["894","Somebody","Related","2020-02-20",'{"name1":"value1","name2":"value2","name3":"value3"}',"expired"];
$fh = fopen($filepath, "a");
fputcsv($fh, $row);
fclose($fh);
which will write to the file
894,Somebody,Related,2020-02-20,"{""name1"":""value1"",""name2"":""value2"",""name3"":""value3""}",expired
and then to read from the file, just read a line at a time and use fgetcsv()...
$fh = fopen($filepath, "r");
print_r(fgetcsv($fh)); // This in a loop to read all lines
fclose($fh);
which shows
Array
(
[0] => 894
[1] => Somebody
[2] => Related
[3] => 2020-02-20
[4] => {"name1":"value1","name2":"value2","name3":"value3"}
[5] => expired
)
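Putting the two halves together, a self-contained round trip (the file path is illustrative):

```php
<?php
// Round-trip sketch: write a row containing a JSON field with fputcsv(),
// then read it back with fgetcsv(). The file path is illustrative.
$filepath = sys_get_temp_dir() . '/export_demo.csv';

$row = ['894', 'Somebody', 'Related', '2020-02-20',
        '{"name1":"value1","name2":"value2"}', 'expired'];

$fh = fopen($filepath, 'w');
fputcsv($fh, $row);       // doubles the embedded quotes automatically
fclose($fh);

$fh = fopen($filepath, 'r');
$readBack = fgetcsv($fh); // undoes the quoting on the way back in
fclose($fh);

var_dump($readBack[4] === $row[4]); // bool(true) -- the JSON survives as one field
```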
One way of solving this is to create a new copy of the array, manipulate the new array, and add the JSON as a sliced part of the original array.
$allRows = array_map('str_getcsv', file($fp));
$new_arr = [];
foreach ($allRows[0] as $key => $item) {
    $json = false;
    if (substr($item, 0, 1) == '{') {
        $json_start = $key;
        $json = true;
    }
    if (substr($item, -2, 2) == '}"') {
        $json_stop = $key;
        $json = true;
        // Slice the json part out of the original array (in your case keys 4, 5, 6)
        $sl = array_slice($allRows[0], $json_start, ($json_stop - $json_start) + 1);
        // Re-join with the commas str_getcsv stripped out, at the key where the json started
        $new_arr[$json_start] = implode(',', $sl);
    }
    if ($json === false) $new_arr[] = $item;
}
And then you have your expected array in $new_arr.

convert multiple json to php array

I have json file which contains multiple json objects.
Example
{"t":"abc-1","d":"2017-12-29 12:42:53"}
{"t":"abc-2","d":"2017-12-29 12:43:05"}
{"t":"abc-3","d":"2017-12-30 14:42:09"}
{"t":"code-4","d":"2017-12-30 14:42:20"}
I want to read this file and store it in a database, but I couldn't convert the JSON to a PHP array, which I could then store in the database. I tried the json_decode function, but it's not working. Every link I find says to use json_decode. Below is my code:
$filename = "folder/filename.json";
$data = file_get_contents($filename);
echo $data;
$tags = json_decode($data, true);
echo "<pre>"; print_r($tags); exit;
$data is echoed but not the $tags.
Thanks in advance.
Make an array of objects and use it later:
$j = array_map('json_decode', file('php://stdin'));
print_r($j);
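The same one-liner against a real file instead of php://stdin — the sample data is written to a temp file so the sketch runs standalone:

```php
<?php
// Write the sample lines to a temp file so the sketch is self-contained
$filename = tempnam(sys_get_temp_dir(), 'json');
file_put_contents($filename,
    '{"t":"abc-1","d":"2017-12-29 12:42:53"}' . PHP_EOL .
    '{"t":"abc-2","d":"2017-12-29 12:43:05"}' . PHP_EOL);

// file() yields one line per element; json_decode each one into an object
$j = array_map('json_decode', file($filename, FILE_IGNORE_NEW_LINES));

echo $j[0]->t; // abc-1
```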
If it's only four lines you can explode and json_decode each line and add it to an array.
$s = '{"t":"abc-1","d":"2017-12-29 12:42:53"}
{"t":"abc-2","d":"2017-12-29 12:43:05"}
{"t":"abc-3","d":"2017-12-30 14:42:09"}
{"t":"code-4","d":"2017-12-30 14:42:20"}';
$arr = explode(PHP_EOL, $s);
foreach ($arr as $line) {
    $json[] = json_decode($line, true);
}
var_dump($json);
https://3v4l.org/97m0E
Multiple objects in a row should be enclosed in a JSON array and separated with commas, like list elements. So you need a [ at the start and a ] at the end of the file. Also, you should close the <pre> tag.
Either fix whatever is generating that 'json', or use fgets to get one line at a time and json_decode every line.
As pointed out by others, the JSON you shared isn't valid, and I think it is stored in your file in the same fashion. I would suggest reading this file line by line; each line you can then decode.
$handle = fopen("folder/filename.json", "r");
if ($handle) {
    echo "<pre>";
    while (($line = fgets($handle)) !== false) {
        $tags = json_decode($line, true);
        print_r($tags); // no exit here, or only the first line gets printed
    }
    fclose($handle);
} else {
    // error opening the file.
}
Assuming a file called `filename.json` contains the following lines
{"t":"abc-1","d":"2017-12-29 12:42:53"}
{"t":"abc-2","d":"2017-12-29 12:43:05"}
{"t":"abc-3","d":"2017-12-30 14:42:09"}
{"t":"code-4","d":"2017-12-30 14:42:20"}
So each one is a separate JSON entity:
$filename = "folder/filename.json";
$lines = file($filename);
foreach ($lines as $line) {
    $obj = json_decode($line);
    $t = $obj->t;
    $d = $obj->d;
    /* do something with constituent pieces */
    echo $d, $t, '<br />';
}
Your JSON is invalid, as it has multiple root elements. Fixing it like the following should work (note the [, ], and commas):
[
{"t":"abc-1","d":"2017-12-29 12:42:53"},
{"t":"abc-2","d":"2017-12-29 12:43:05"},
{"t":"abc-3","d":"2017-12-30 14:42:09"},
{"t":"code-4","d":"2017-12-30 14:42:20"}
]
If you cannot influence how the JSON file is created, you will need to write your own reader, as PHP's json_decode() does not accept invalid JSON. You could split the file on newlines and parse each line individually.

Export multiple json files

I'm using WordPress with NextGEN Gallery, and I would like to get the images from each specific gallery ID into a separate JSON file. So far I have this code, but that just exports the images from the given ID (in this case, 7). I know there must be some kind of foreach function, but my knowledge won't go that far.
Could someone help me with this one? The code i have so far:
mysql_connect("localhost", "DB-NAME", "DB-PASSWORD");
mysql_select_db("DB-NAME");
$result = mysql_query("SELECT filename FROM zs_ngg_pictures WHERE galleryid = '7' ORDER BY sortorder LIMIT 3");
while ($row = mysql_fetch_assoc($result)) {
    $output[] = $row;
}
$fp = fopen('result.json', 'w');
fwrite($fp, json_encode($output));
fclose($fp);
echo json_encode($output);
If I correctly understand what you're trying to achieve, you can simply get all the pictures and put them in a multidimensional array:
$result = mysql_query("SELECT filename, galleryid FROM zs_ngg_pictures ORDER BY sortorder");
while ($row = mysql_fetch_assoc($result)) {
    $output[$row['galleryid']][] = $row['filename'];
}
and you have a PHP array like this (if you got two galleries with id 7 and 23):
array(
    7  => array('file7-1.jpg', 'file7-2.jpg', 'file7-3.jpg'),
    23 => array('file23-1.jpg', 'file23-2.jpg'),
);
so the result.json file will be like:
{ "7" : ["file7-1.jpg", "file7-2.jpg", "file7-3.jpg"],
"23" : ["file23-1.jpg", "file23-2.jpg"] }
If you want separate files for each galleryid you can then loop and save files with different names:
foreach ($output as $gall_id => $gallery) {
    file_put_contents('result-' . $gall_id . '.json', json_encode($gallery));
}
and you'll have two files named result-7.json and result-23.json
Instead of doing complicated fopen() stuff and echo:
$fp = fopen('result.json', 'w');
fwrite($fp, json_encode($output));
fclose($fp);
echo json_encode($output);
you can simply call:
file_put_contents('result.json', json_encode($output));
readfile('result.json');
or you skip the file and simple output the JSON:
echo json_encode($output);
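A quick check that the file_put_contents() plus readfile() pair sends back exactly the bytes that were written — the path and data below are made up for the demo:

```php
<?php
// Demo: file_put_contents() + readfile() streams back exactly what was written.
// The path and the $output data are illustrative.
$output = ['7' => ['file7-1.jpg', 'file7-2.jpg']];
$path = sys_get_temp_dir() . '/result_demo.json';

file_put_contents($path, json_encode($output));

ob_start();
readfile($path);              // sends the file straight to the output buffer
$echoed = ob_get_clean();

var_dump($echoed === json_encode($output)); // bool(true)
```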

Using file_get_contents() to make an array

I am trying to upload a CSV file and get its contents into an array, but I am getting this error (repeated for each line after line 10):
Notice: Undefined offset: 1 in C:\xampp\htdocs\amazon\upload_file.php on line 10
Below is a sample of my code:
if ($handle = file_get_contents($_FILES["file"]["tmp_name"])) {
    $data = array();
    while ($csv = array(file_get_contents($_FILES["file"]["tmp_name"]))) {
        $data = array(
            'order-id'      => $csv[0],
            'order-item-id' => $csv[1], // This is line 10.
            'purchase-date' => $csv[2],
            'payments-date' => $csv[3],
file() reads a file and puts each line into an array element. fgetcsv() and its family of functions are very useful when dealing with CSV files.
Your code array(file_get_contents($_FILES["file"]["tmp_name"])) will only ever have one element, because file_get_contents() returns a single string.
This issue occurs if your file has only one line. I guess you need to do this:
$row = explode(",", $csv[0]);
$data = array(
    'order-id'      => $row[0],
    'order-item-id' => $row[1], // This is line 10.
    'purchase-date' => $row[2],
    'payments-date' => $row[3]
);
Also, you can use functions like fgetcsv() to parse your CSV file.
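A hedged sketch of that fgetcsv() approach — the column keys mirror the question, while the sample file stands in for the upload's tmp_name and its values are made up for the demo:

```php
<?php
// Parse an uploaded CSV row by row with fgetcsv(). The column keys mirror
// the question; the sample file stands in for $_FILES["file"]["tmp_name"].
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "111-222,333-444,2024-01-01,2024-01-03\n");

$data = [];
if (($handle = fopen($tmp, 'r')) !== false) {
    while (($csv = fgetcsv($handle)) !== false) {
        $data[] = [
            'order-id'      => $csv[0],
            'order-item-id' => $csv[1],
            'purchase-date' => $csv[2],
            'payments-date' => $csv[3],
        ];
    }
    fclose($handle);
}
```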

Music playlist via PHP. json_encode() array limit?

I'm using PHP to create a playlist. Two random songs are chosen from a directory, and their name and location are stored in an array and then written to a file via json_encode().
$arraySongs[] = array('name' => $songName , 'mp3' => $webUrl);
This works great. I can make a very long playlist, two songs at a time. I'd also like to remove songs, so I have an AJAX-powered delete button that posts the ID of the track to be deleted. PHP then loads the whole tracklist...
$decoded = json_decode(file_get_contents($tracklist),true);
and removes the given song from the array, then re encodes and re writes the json text file. This all works great.
The problem comes whenever I try to delete anything with a playlist of more than 10 items.
Typically, my song.json file goes [{name:song1,mp3:song url},{name:song2,mp3:song2 url}]
However, when I have a list of more than 10 items, the re encoded playlist looks like this:
[{ ... },{name:song9,mp3:song9 url}],[10,{"name":song10,mp3:song10 url}]
Why does my re-encoded array get that strange [10,{"name"... [11,{"name"... [12,{"name"... while everything below 10 is always fine?
Thanks for reading this! Any suggestions would be greatly appreciated, this is driving me nuts!
Here is the code I'm using:
<?php
$json = "song.php";
$decoded = json_decode(file_get_contents($json), true);
$playlist = array();
$names = array();
// Now get i from JavaScript
$i = $_POST['id'];
// Select i's array
$collect_array = $decoded[$i];
while (list($key, $val) = each($collect_array)) {
    // Remove i's values
    // echo "<br />$key -> $val <br>";
    unset($decoded[$i]);
}
// Take all the remaining arrays
$collect_array = $decoded;
while (list($key, $val) = each($collect_array)) {
    $arraySongs[] = array($key, $val);
}
// Our new array, ready for json.
$jsonData = json_encode($arraySongs);
// open song.php and scribble it down
$tracklist = $json;
$fh = fopen($tracklist, 'w') or die("can't open filename: $tracklist");
fwrite($fh, $jsonData);
fclose($fh);
?>
Try removing elements with unset().
Debug your code (not posted in the thread, so do it yourself) by adding a line where you var_dump() or print_r() the whole thing before json_encode().
Or it's a bug in json_encode(), which would not be nice...
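For what it's worth, the stray numeric keys in the question are consistent with unset() preserving the remaining keys: json_encode() turns an array with key gaps into a JSON object rather than a list. Re-indexing with array_values() before encoding avoids that. A minimal sketch, not the poster's exact data:

```php
<?php
// After unset() the array keeps its remaining keys (0 and 2 here)...
$playlist = [
    ['name' => 'song1', 'mp3' => 'url1'],
    ['name' => 'song2', 'mp3' => 'url2'],
    ['name' => 'song3', 'mp3' => 'url3'],
];
unset($playlist[1]);

// ...so json_encode() emits an object keyed "0" and "2", not a list
echo json_encode($playlist), PHP_EOL;

// Re-indexing first restores a plain JSON array
echo json_encode(array_values($playlist)), PHP_EOL;
// [{"name":"song1","mp3":"url1"},{"name":"song3","mp3":"url3"}]
```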
Encode the track ID on two or even three digits using the PHP function sprintf() with the format %02d. This worked fine for me.
