I have some data in JSON format in a text file, as shown below. I need to insert this data into MySQL using PHP, but I can't get it to work.
{"address":"+92 334 6629424","service_center":"Test Sending Sms","id":3,"locked":0,"person":0,"protocol":0,"read":0,"reply_path_present":2,"seen":0,"error_code":0,"status":1,"date":1873326412,"thread_id":1,"type":-1}
My PHP file contains the following code.
<?php
$source_file = "SMS2012-05-21.txt";
$handle = fopen($source_file, "r");
$col_names = implode(",", fgetcsv($handle)); // Get a comma-separated list of column names
$link = mysql_connect('localhost', 'root', '');
mysql_select_db("messages");
while (($data = fgetcsv($handle)) !== FALSE) {
    $values = "";
    foreach ($data as $key => $value) {
        if ($key != 0) $values .= ", ";
        $values .= "'" . mysql_escape_string($value) . "'";
    }
    mysql_query('INSERT INTO messages (' . $col_names . ') VALUES (' . $values . ')');
}
?>
I don't get any result, nor any error. Could anyone please point out where I am going wrong?
You should use the json_decode function to work with JSON data.
<?php
$source_file = "SMS2012-05-21.txt";
$string = file_get_contents($source_file);
$json = json_decode($string,true);
//DB Conn Handling Stuff
$cols = array(); $values = array();
foreach ($json as $key => $value) {
    array_push($cols, '`' . $key . '`');
    if (is_string($value)) {
        array_push($values, "'" . $value . "'");
    } else {
        array_push($values, $value);
    }
}
$col_name = implode(',', $cols);
$col_value = implode(',', $values);
$query = 'INSERT INTO messages(' . $col_name . ') VALUES (' . $col_value . ')';
mysql_query($query, $connection) or die(mysql_error());
?>
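As a side note, the mysql_* functions used above are deprecated (and removed in PHP 7). A minimal sketch of the same insert using PDO with a prepared statement, reusing the file name and the messages database/table from the question (treat it as a sketch, not a drop-in replacement):
<?php
// Sketch only: same SMS2012-05-21.txt file and `messages` database/table as
// above, but using PDO with a prepared statement instead of mysql_*.
$json = json_decode(file_get_contents('SMS2012-05-21.txt'), true);

$pdo = new PDO('mysql:host=localhost;dbname=messages', 'root', '');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Column names come from the JSON keys in the file, not from user input.
$cols = array_keys($json);
$colList = '`' . implode('`,`', $cols) . '`';
$placeholders = implode(',', array_fill(0, count($cols), '?'));

$stmt = $pdo->prepare("INSERT INTO messages ($colList) VALUES ($placeholders)");
$stmt->execute(array_values($json)); // values are bound, not concatenated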
Maybe I've missed something, but I would use it this way:
<?php
$source_file = "SMS2012-05-21.txt";
$handle = fopen("SMS2012-05-21.txt", "r");
$data = fread($handle, filesize($source_file));
$jsonArray = json_decode($data, true);
$keys = implode(',', array_keys($jsonArray));
$values = "'" . implode("','", $jsonArray) . "'";
$link = mysql_connect('localhost', 'root', '');
mysql_select_db("messages");
mysql_query('INSERT INTO messages ('.$keys.') VALUES ('.$values.')');
As the title says, I want to parse 4 different files and insert them into 1 SQL row. I have parsed the files, so I have the information; however, it's not inserting into my table.
This is the code I have. I eliminated the actual SQL names and values to save time. My two main questions are: 1) is this the right way to do this, and 2) is there a better way?
<?php
$connect = mysqli_connect("reserve1", "root", "","server_31");
$dir = "/Users/Administrator/Desktop/Reserve1";
if (is_dir($dir)) {
if ($dh = opendir($dir)) {
foreach(glob("*.json") as $filename) {
$data = file_get_contents($filename);
$array = json_decode($data, true);
foreach ($array[0] as $row) {
$sql = "INSERT INTO servers_updated (---)
VALUES ('---)";
$connect->query($sql);
}
foreach(glob("*_processor.json") as $filename) {
$data = file_get_contents($filename);
$info = json_decode($data, true);
foreach($info[1] as $row) {
$sql = "INSERT INTO servers_updated
(--- ) VALUES (---)";
$connect->query($sql);
}
foreach(glob("*_drives.json") as $filename) {
$data = file_get_contents($filename);
$info = json_decode($data, true);
foreach($info[1] as $row) {
$sql=" INSERT INTO servers_updated (---) VALUES (---)";
$connect->query($sql);
}
foreach(glob("*_memory.json") as $filename) {
$data = file_get_contents($filename);
$stuff = json_decode($data, true);
foreach($stuff[1] as $row) {
$sql =" INSERT INTO servers_updated
(--- ) VALUES (----)";
$connect->query($sql);
}
}
}
}
}
}
}
?>
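On the second question ("is there a better way"): a flatter structure is usually easier to follow. Below is a rough sketch that runs each glob() loop on its own instead of nesting them and reuses one prepared statement; the column names (serial, model) and JSON keys are hypothetical stand-ins for the redacted ones above.
<?php
// Sketch only: serial/model are hypothetical stand-ins for the redacted columns.
$connect = mysqli_connect("reserve1", "root", "", "server_31");
$dir = "/Users/Administrator/Desktop/Reserve1";
chdir($dir); // glob() below matches files relative to the current directory

$stmt = $connect->prepare("INSERT INTO servers_updated (serial, model) VALUES (?, ?)");

foreach (glob("*_processor.json") as $filename) {
    $info = json_decode(file_get_contents($filename), true);
    foreach ($info[1] as $row) {
        // Bind this row's fields; adjust the keys to match the real JSON structure.
        $stmt->bind_param("ss", $row['serial'], $row['model']);
        $stmt->execute();
    }
}
// Repeat a separate, non-nested loop for *.json, *_drives.json and *_memory.json.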
I have a solution with PHP as server-side, Vue JS for front-end and MySQL as DB.
The UI bundles the data as JSON and posts it to PHP through axios, and PHP in turn decodes the JSON and inserts it into MySQL.
Here is my PHP code (omitting the other lines like connecting etc.):
$data = file_get_contents("php://input");
$jsonData = json_decode($data, true);
//echo var_dump($jsonData);
// Below is the jsonData as dumped
//[{"candidate_id":"SM_009","FirstName":"test","LastName":"dummy","DOB":"1990-06-05"}]
$tableName = 'profile';
foreach((array)$jsonData as $id=>$row) {
$insertPairs = array();
foreach ((array)$row as $key=>$val) {
$insertPairs[addslashes($key)] = addslashes($val);
}
$insertKeys = '`' . implode('`,`', array_keys($insertPairs)) . '`';
$insertVals = '"' . implode('","', array_values($insertPairs)) . '"';
$sql = "INSERT INTO `{$tableName}` ({$insertKeys}) VALUES ({$insertVals});" ;
//echo var_dump($sql);
$stmt = $con->prepare($sql);
$stmt->execute();
}
However, here is the actual insert statement generated, which is obviously wrong.
INSERT INTO `profile` (`0`) VALUES ("[{\"candidate_id\":\"SM_009\",\"FirstName\":\"test\",\"LastName\":\"dummy\",\"DOB\":\"1990-06-05\"}]");
Where am I going wrong? Any help would be greatly appreciated.
Thanks
Note: when I use the same dumped JSON data as a hardcoded string, it works.
$data ='[{"candidate_id":"SM_009","FirstName":"test","LastName":"dummy","DOB":"1990-06-12"}]';
//$data = file_get_contents("php://input");
...
Generated statement:
"INSERT INTO `profile` (`candidate_id`,`FirstName`,`LastName`,`DOB`) VALUES ("SM_009","test","dummy","1990-06-12");"
The reason you are still seeing the JSON in your insert statement is that you only decoded the outer layer of your JSON string, which gives you an array whose data element still contains a JSON string. To resolve this, just decode that value as well, like so:
<?php
$data = file_get_contents("php://input");
$jsonData = json_decode($data, true);
$jsonData = json_decode($jsonData['data'], true); //Decode the data as well
$tableName = 'profile';
foreach((array)$jsonData as $id => $row){
$insertPairs = array();
foreach ((array)$row as $key=>$val) {
$insertPairs[addslashes($key)] = addslashes($val);
}
$insertKeys = '`' . implode('`,`', array_keys($insertPairs)) . '`';
$insertVals = '"' . implode('","', array_values($insertPairs)) . '"';
$sql = "INSERT INTO `{$tableName}` ({$insertKeys}) VALUES ({$insertVals});" ;
$stmt = $con->prepare($sql);
$stmt->execute();
}
You can check out a working example here: https://ideone.com/i86iVP
You can do it like this:
$jsonString = '{"data":[{"candidate_id":"SM_009","FirstName":"test","LastName":"dummy","DOB":"1990-06-12"}]}';
$jsonArray = json_decode($jsonString,true);
$data = $jsonArray['data'];
//$data = json_decode(file_get_contents("php://input"),true);
//$json = json_decode($data, true); $json = $data['data'];
//json_decode($_GET['data']);
$tableName = 'profile';
foreach((array)$data as $id=>$row) {
$insertPairs = array();
foreach ((array)$row as $key=>$val) {
$key = addslashes($key);
$val = addslashes($val);
$insertPairs[] = " `{$key}` = '{$val}' ";
}
$sqlInsert = implode(", ", $insertPairs);
$sql = "INSERT INTO `{$tableName}` SET {$sqlInsert} ";
echo var_dump($sql);
/*
string(126) "INSERT INTO `profile` SET `candidate_id` = 'SM_009' , `FirstName` = 'test' , `LastName` = 'dummy' , `DOB` = '1990-06-05' "
*/
// $stmt = $con->prepare($sql);
// $stmt->execute();
}
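One caution about both answers: the values are concatenated (or addslashes()'d) straight into the SQL string, so even where prepare() is called nothing is actually parameterized, and addslashes() is not a reliable escape for SQL. A sketch of the same insert with real placeholders, assuming $con is a PDO connection (the question doesn't show how it was created):
$tableName = 'profile';
foreach ((array)$data as $row) {
    // Build the column list from the row's keys; keep a whitelist check here
    // if the keys can ever come from untrusted input.
    $cols = array_keys($row);
    $colList = '`' . implode('`,`', $cols) . '`';
    $placeholders = implode(',', array_fill(0, count($cols), '?'));
    $stmt = $con->prepare("INSERT INTO `{$tableName}` ({$colList}) VALUES ({$placeholders})");
    $stmt->execute(array_values($row)); // values are bound, not concatenated
}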
I'm trying to create a table from my CSV file, which also has column headers inside it. However, when I try to run it, it just shows me a query and nothing else happens. I'm trying to find out what the problem is and I'm stuck. Can you help me with this? Thanks.
Here's my code
<?php
$server = "localhost";
$username = "root";
$pass = "";
$dbname = "test";
$conn = new PDO("mysql:host=$server;dbname=$dbname", $username, $pass);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Parameters: filename.csv table_name
$file = 'C:\Users\HP\Desktop\ACC.DBF.csv';
$table = 'acc';
// get structure from csv and insert db
ini_set('auto_detect_line_endings',TRUE);
$handle = fopen($file,'r');
// first row, structure
if ( ($data = fgetcsv($handle) ) === FALSE ) {
echo "Cannot read from csv $file";die();
}
$fields = array();
$field_count = 0;
for($i=0;$i<count($data); $i++) {
$f = strtolower(trim($data[$i]));
if ($f) {
// normalize the field name, strip to 20 chars if too long
$f = substr(preg_replace ('/[^0-9a-z]/', '_', $f), 0, 20);
$field_count++;
$fields[] = $f.' VARCHAR(255)';
}
}
$sqlcreate = $conn->prepare("CREATE TABLE $table (" . implode(', ', $fields) . ')');
$sqlcreate->execute();
echo "Create Table success" . "<br /><br />";
//$db->query($sql);
while ( ($data = fgetcsv($handle) ) !== FALSE ) {
$fields = array();
for($i=0;$i<$field_count; $i++) {
$fields[] = '\''.addslashes($data[$i]).'\'';
}
$sqlinsert = $conn->prepare("Insert into $table values(" . implode(', ', $fields) . ')');
$sqlinsert->execute();
echo "Insert Table success" ;
}
fclose($handle);
ini_set('auto_detect_line_endings',FALSE);
?>
I have created a utility script which does the same thing you are trying to do. Please check if it helps you.
<?php
$fileName = './WP.csv';
function connectDB()
{
$server = "mysql2345";
$username = "root";
$pass = "root";
$dbname = "sc1";
$conn = new PDO("mysql:host=$server;dbname=$dbname", $username, $pass);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
return $conn;
}
function createDb($csv_path, $db)
{
if (($csv_handle = fopen($csv_path, "r")) === false) {
throw new Exception('Cannot open CSV file');
}
if(!isset($delimiter)) {
$delimiter = ',';
}
if (!isset($table)) {
$table = preg_replace("/[^A-Z0-9]/i", '', basename($csv_path));
}
if (!isset($fields)) {
$fields = array_map(function ($field){
return $field;
}, fgetcsv($csv_handle, 0, $delimiter));
}
$create_fields_str = join(', ', array_map(function ($field){
return "$field VARCHAR(200) NULL";
}, $fields));
echo $create_table_sql = "CREATE TABLE IF NOT EXISTS $table ($create_fields_str)";
$db->query($create_table_sql);
return ['table'=>$table, 'fields'=>$fields];
}
function loadData($fileName, $tableName, $fields, $db)
{
$fieldStr = implode(',', $fields);
$query = <<<eof
LOAD DATA LOCAL INFILE '$fileName'
INTO TABLE $tableName
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r'
($fieldStr)
eof;
echo $query;
$db->query($query);
}
$db = connectDB();
$tableInfo = createDb($fileName, $db);
loadData($fileName, $tableInfo['table'], $tableInfo['fields'], $db);
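One caveat: LOAD DATA LOCAL INFILE is disabled by default in many MySQL/PHP setups, so the client connection usually has to opt in (and the server must allow local_infile). A sketch of connectDB() above with that option enabled through PDO:
function connectDB()
{
    $server = "mysql2345";
    $username = "root";
    $pass = "root";
    $dbname = "sc1";
    // MYSQL_ATTR_LOCAL_INFILE lets this connection send local files to the server;
    // without it the LOAD DATA LOCAL query in loadData() is typically rejected.
    $conn = new PDO("mysql:host=$server;dbname=$dbname", $username, $pass,
        array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    return $conn;
}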
I wrote this PHP code to extract values from a JSON file and insert them into a MySQL database.
<?php
//connect to mysql db
$con = mysqli_connect("localhost","root","","db_tweets") or die('Could not connect: ' . mysql_error());
//read the json file contents
$jsondata = file_get_contents('prova.json');
//convert json object to php associative array
$data = json_decode($jsondata, true);
foreach ($data as $u => $z){
foreach ($z as $n => $line){
//get the tweet details
$text = $line['text'];
$id_tweet = $line['id_str'];
$date = $line['created_at'];
$id_user = $line['user']['id_str'];
$screen_name = $line['user']['screen_name'];
$name = $line['user']['name'];
$sqlu = "INSERT INTO user(id_user, screen_name, name)
VALUES ('".$id_user."', '".$screen_name."', '".$name."')";
}
}
if(!mysqli_query($con, $sqlu))
{
die('Error : ' . mysql_error());
}
?>
Doing it this way, it always writes the values into the first row of my table, overwriting the previous value, so only the last one remains.
How can I:
1) insert all the values as separate rows?
2) parse multiple JSON files?
Try this.
You are only executing the last query because your mysqli_query() call is outside the loop.
Method 1:
<?php
//connect to mysql db
$con = mysqli_connect("localhost","root","","db_tweets") or die('Could not connect: ' . mysqli_connect_error());
//read the json file contents
$jsondata = file_get_contents('prova.json');
//convert json object to php associative array
$data = json_decode($jsondata, true);
foreach ($data as $u => $z){
foreach ($z as $n => $line){
//get the tweet details
$text = $line['text'];
$id_tweet = $line['id_str'];
$date = $line['created_at'];
$id_user = $line['user']['id_str'];
$screen_name = $line['user']['screen_name'];
$name = $line['user']['name'];
$sqlu = "INSERT INTO user(id_user, screen_name, name)
VALUES ('".$id_user."', '".$screen_name."', '".$name."')";
if(!mysqli_query($con, $sqlu))
{
die('Error : ' . mysqli_error($con));
}
}
}
?>
Method 2:
<?php
//connect to mysql db
$con = mysqli_connect("localhost","root","","db_tweets") or die('Could not connect: ' . mysqli_connect_error());
//read the json file contents
$jsondata = file_get_contents('prova.json');
//convert json object to php associative array
$data = json_decode($jsondata, true);
$values = "";
foreach ($data as $u => $z){
foreach ($z as $n => $line){
//get the tweet details
$text = $line['text'];
$id_tweet = $line['id_str'];
$date = $line['created_at'];
$id_user = $line['user']['id_str'];
$screen_name = $line['user']['screen_name'];
$name = $line['user']['name'];
$values .= "('".$id_user."', '".$screen_name."', '".$name."'),";
}
}
if(!empty($values)) {
$values = substr($values, 0, -1);
$sqlu = "INSERT INTO user(id_user, screen_name, name) VALUES {$values}";
if(!mysqli_query($con, $sqlu))
{
die('Error : ' . mysqli_error($con));
}
}
?>
Answer for multiple files:
<?php
//connect to mysql db
$con = mysqli_connect("localhost","root","","db_tweets") or die('Could not connect: ' . mysqli_connect_error());
$files = array("prova.json", "file2.json");
foreach ($files as $file) {
//read the json file contents
$jsondata = file_get_contents($file);
//convert json object to php associative array
$data = json_decode($jsondata, true);
$values = "";
foreach ($data as $u => $z) {
foreach ($z as $n => $line) {
//get the tweet details
$text = $line['text'];
$id_tweet = $line['id_str'];
$date = $line['created_at'];
$id_user = $line['user']['id_str'];
$screen_name = $line['user']['screen_name'];
$name = $line['user']['name'];
$values .= "('" . $id_user . "', '" . $screen_name . "', '" . $name . "'),";
}
}
if (!empty($values)) {
$values = substr($values, 0, -1);
$sqlu = "INSERT INTO user(id_user, screen_name, name) VALUES {$values}";
if (!mysqli_query($con, $sqlu)) {
die('Error : ' . mysqli_error($con));
}
}
}
?>
With every iteration of the loop you're overwriting $sqlu before ever passing it to mysqli_query after the loops. So once the loops complete, you're left with only the last value assigned to $sqlu, and that's the only query that gets executed.
Instead, execute your query inside the loop and...
Use PHP mysqli_ functions mysqli_prepare, mysqli_stmt_bind_param, and mysqli_stmt_execute to simplify and secure your query:
//connect to mysql db
$con = mysqli_connect("localhost","root","","db_tweets") or die('Could not connect: ' . mysqli_connect_error());
// prepare your insert query
$stmt = mysqli_prepare($con, 'INSERT INTO user(id_user, screen_name, name) VALUES (?, ?, ?)');
// bind the upcoming variable names to the query statement
mysqli_stmt_bind_param($stmt, 'iss', $id_user, $screen_name, $name);
// loop over JSON files
$jsonfiles = array('prova.json', 'provb.json', 'provc.json');
foreach ( $jsonfiles as $jsonfile ) {
//read the json file contents
$jsondata = file_get_contents($jsonfile);
//convert json object to php associative array
$data = json_decode($jsondata, true);
foreach ($data as $u => $z){
foreach ($z as $n => $line){
//get the tweet details
$id_user = $line['user']['id_str'];
$screen_name = $line['user']['screen_name'];
$name = $line['user']['name'];
// execute this insertion
mysqli_stmt_execute($stmt);
}
}
}
So this not only uses fewer database resources by preparing your query once and keeps the code cleaner, but also properly escapes your insertion values to protect against SQL injection.
Added an array $jsonfiles containing any number of JSON filenames, and used a foreach construct to loop over the JSON files.
I've written the following PHP script to pull Base64-encoded pictures out of a database and write them to files. It also outputs a CSV where each line has the Serial, Main Picture, Main Modified Date, Extra Pics, and Extra Pics Modified Date.
<?php
date_default_timezone_set('America/Edmonton');
$serverName = "database";
$connectionInfo = array( "Database"=>"CRM_MSCRM");
$conn = sqlsrv_connect( $serverName, $connectionInfo);
if( $conn === false )
{
echo "Unable to connect.\n\n";
die( print_r( sqlsrv_errors(), true));
}
else
{
echo "Connected. Selecting trucks...\n\n";
}
$tsql = "SELECT * FROM CRM_MSCRM.dbo.Trader_Export_Simple";
$stmt = sqlsrv_query( $conn, $tsql);
if( $stmt === false )
{
echo "Error executing query.\n\n";
die( print_r( sqlsrv_errors(), true));
}
$csvData = array();
while ($row = sqlsrv_fetch_array($stmt))
{
$count = 1;
$mainpicsql = "SELECT * FROM CRM_MSCRM.dbo.TruckImages WHERE Serial = '".$row[0]."' AND MainPic = 1";
$mainpicstmt = sqlsrv_query( $conn, $mainpicsql);
while ($mainpicrow = sqlsrv_fetch_array($mainpicstmt))
{
$truck = $mainpicrow[1];
$mainfilename = $truck ."-". $count . ".png";
file_put_contents($mainfilename, base64_decode($mainpicrow[0]));
$mainpicdate = $mainpicrow[3]->format("d/m/Y H:i:s");
$mainfilename = "http://images.website/images/".$mainfilename;
echo $mainpicdate."\n";
}
$picsql = "SELECT * FROM CRM_MSCRM.dbo.TruckImages WHERE Serial = '".$row[0]."' AND MainPic = 0";
$picstmt = sqlsrv_query( $conn, $picsql);
$extrapicsdate = "";
$filenames = "";
while ($picrow = sqlsrv_fetch_array($picstmt))
{
$count++;
$filename = $picrow[1] ."-". $count . ".png";
file_put_contents($filename, base64_decode($picrow[0]));
$picdate = $picrow[3]->format("d/m/Y H:i:s");
$filenames .= "http://images.website/images/".$filename.";";
$extrapicsdate .= $picdate.";";
}
$filenames = rtrim($filenames, ";");
$extrapicsdate = rtrim($extrapicsdate, ";");
echo $filenames."\n";
echo $extrapicsdate."\n";
if ($truck != "") {
$csvData[] = array($truck, $mainfilename, $mainpicdate, $filenames, $extrapicsdate);
}
if ($filenames != "")
{
$filenames = "";
}
if ($extrapicsdate != "")
{
$extrapicsdate = "";
}
echo "Next truck...\n\n";
$truck = "";
$mainfilename = "";
$mainpicdate = "";
}
$fp = fopen('file.csv', 'w');
foreach ($csvData as $fields) {
    fputcsv($fp, $fields);
}
fclose($fp);
//print_r($csvData);
sqlsrv_free_stmt( $stmt);
sqlsrv_free_stmt( $picstmt);
sqlsrv_close( $conn);
?>
I'd like to take the $truck from each line in this file, search another CSV for the row containing that $truck, and append the columns from the matching row in that CSV onto this one. Rinse and repeat for all lines in this CSV file.
I just spent a while making this work after not using PHP for several years, so my head's a little sore. Would anyone want to point me in the right general direction on this part?
Thanks for the help!
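A rough sketch of one way to do the merge, assuming the other file is called other.csv (a hypothetical name) and has the truck serial in its first column, just as file.csv does above:
<?php
// Sketch only: other.csv and its column layout are assumptions.
// Index the other CSV by its first column (the truck serial).
$lookup = array();
$other = fopen('other.csv', 'r');
while (($row = fgetcsv($other)) !== false) {
    $lookup[$row[0]] = array_slice($row, 1); // keep everything after the serial
}
fclose($other);

// Read file.csv, append the matching columns, and write a merged file.
$in = fopen('file.csv', 'r');
$out = fopen('merged.csv', 'w');
while (($fields = fgetcsv($in)) !== false) {
    $truck = $fields[0]; // serial is the first column written above
    if (isset($lookup[$truck])) {
        $fields = array_merge($fields, $lookup[$truck]);
    }
    fputcsv($out, $fields);
}
fclose($in);
fclose($out);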