JSON Array to PHP Array Multidimensional - php

I have JSON data like this:
{
    "response": {
        "count": 212,
        "list": [
            {
                "code": "02007",
                "name": "swept the room",
                "rate": 750000,
                "withValue": false
            },
            {
                "code": "02005",
                "name": "mop room",
                "rate": 600000,
                "withValue": false
            },
            {
                "code": "02003",
                "name": "buying food",
                "rate": 175000,
                "withValue": false
            }
        ]
    },
    "metaData": {
        "message": "OK",
        "code": 200
    }
}
and I have a table schema like this:
mysql> desc master_service;
+-----------+-------------+------+-----+---------+----------------+
| Field     | Type        | Null | Key | Default | Extra          |
+-----------+-------------+------+-----+---------+----------------+
| id        | int(25)     | NO   | PRI | NULL    | auto_increment |
| code      | varchar(10) | YES  |     | NULL    |                |
| name      | varchar(20) | YES  |     | NULL    |                |
| rate      | double      | YES  |     | NULL    |                |
| withvalue | tinyint(1)  | YES  |     | NULL    |                |
+-----------+-------------+------+-----+---------+----------------+
and my code is like this:
//using php pdo
include_once 'db_connect.php';
$data = json_decode($response, true);
$tempservice = array();
if (isset($data['response']) && isset($data['response']['list'])) {
    //use foreach on the ['response']['list'] index - this is where the service data is stored
    foreach ($data['response']['list'] as $service) {
        $tempservice[$service['code']] = $service['name'];
    }
}
foreach ($tempservice as $key => $value) {
    $sql = "insert into master_service(code,name) values ('$key','$value')";
    $st = $dbh->prepare($sql);
    $st->execute();
}
It only saves the code and name columns to the database. I want rate and withValue to be saved as well.

Just use a foreach loop and add the missing columns, no need for a restructured $tempservice array:
include_once 'db_connect.php';
$data = json_decode($response, true);
// Simplified if; checking isset($data['response']) separately is redundant
if (isset($data['response']['list'])) {
    // Prepare the statement once, using placeholders
    $sql = "INSERT INTO master_service (code,name,rate,withValue) VALUES (:code,:name,:rate,:withValue)";
    $stmt = $dbh->prepare($sql);
    foreach ($data['response']['list'] as $row) {
        // Execute query with parameters
        $stmt->execute($row);
    }
}
Important: I replaced the variables in your query with placeholders. This way it is safe against SQL Injection. I also put the prepare outside of the loop so that it doesn't "reprepare" on every iteration.
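One caveat when passing each decoded row straight to execute(): PDO binds the boolean false in withValue as an empty string, which strict MySQL modes reject for a tinyint column. Here is a minimal sketch (no database required, sample JSON trimmed from the question) that builds the parameter arrays with an explicit cast:

```php
<?php
// Build the parameter arrays that would be passed to $stmt->execute().
// PDO converts boolean false to '', so withValue is cast to int explicitly.
$json = '{"response":{"count":212,"list":[
    {"code":"02007","name":"swept the room","rate":750000,"withValue":false},
    {"code":"02005","name":"mop room","rate":600000,"withValue":false}
]},"metaData":{"message":"OK","code":200}}';

$data = json_decode($json, true);
$rows = [];
if (isset($data['response']['list'])) {
    foreach ($data['response']['list'] as $service) {
        $rows[] = [
            'code'      => $service['code'],
            'name'      => $service['name'],
            'rate'      => $service['rate'],
            'withValue' => (int) $service['withValue'], // false -> 0
        ];
        // With a real connection: $stmt->execute(end($rows));
    }
}
print_r($rows[0]);
```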

You can try this:
include_once 'db_connect.php';
$data = json_decode($response, true);
$tempservice = [];
if (isset($data['response']) && isset($data['response']['list'])) {
    //use foreach on the ['response']['list'] index - this is where the service data is stored
    foreach ($data['response']['list'] as $service) {
        $withValue = ($service['withValue']) ? 1 : 0;
        $tempservice[$service['code']] = ['name' => $service['name'], 'rate' => $service['rate'], 'withvalue' => $withValue];
    }
}
foreach ($tempservice as $key => $value) {
    $sql = "insert into master_service(code, name, rate, withvalue) values ('$key', '".$value['name']."', '".$value['rate']."', '".$value['withvalue']."')";
    $st = $dbh->prepare($sql);
    $st->execute();
}
Note: please fix any syntax errors. You can also skip creating the $tempservice array and execute the INSERT there directly:

foreach ($data['response']['list'] as $value) {
    $sql = "insert into master_service(code,name,rate,withvalue) values ('{$value['code']}','{$value['name']}', '{$value['rate']}', '{$value['withValue']}')";
    $st = $dbh->prepare($sql);
    $st->execute();
}
Sorry, I forgot the true parameter on json_decode. Edited.

Related

Slow data insertion into mysql database

I am trying to make a CSV upload page for an application that I'm building. It needs to be able to upload thousands of rows of data in seconds, each row including a first name, last name, and phone number. The data is being uploaded to a VM running Ubuntu Server. When I run the script to upload the data, it takes almost 2 minutes to upload 1500 rows. The script uses PDO; I have also made a test script in Python to see if it was a PHP issue, and the Python script is just as slow. I have made CSV upload scripts in the past that are exactly the same and would upload thousands of rows in seconds. We have narrowed the issue down to the script, as we have tested it on other VMs hosted closer to us and the issue still persists. Is there an obvious issue with the script or PDO that could be slowing it down? Below is the code for the script.
<?php
$servername = [Redacted];
$username = [Redacted];
$password = [Redacted];
try {
    $conn = new PDO("mysql:host=$servername;dbname=test", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo "Connected successfully";
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
echo print_r($_FILES);
$fileTmpPath = $_FILES['fileToUpload']['tmp_name'];
$fileName = $_FILES['fileToUpload']['name'];
$fileSize = $_FILES['fileToUpload']['size'];
$fileType = $_FILES['fileToUpload']['type'];
$CSVfp = fopen($fileTmpPath, "r");
$final_array = [];
while (($data = fgetcsv($CSVfp)) !== false) {
    $recived = [];
    foreach ($data as $i) {
        array_push($recived, $i);
    }
    array_push($final_array, $recived);
}
echo print_r($final_array);
fclose($CSVfp);
$non_compliant_rows = [];
foreach ($final_array as $key => $row) {
    $fname = preg_replace('/[^A-Za-z0-9]/', "", $row[0]);
    $lname = preg_replace('/[^A-Za-z0-9]/', "", $row[1]);
    $mobileNumber = preg_replace('/[^0-9]/i', '', $row[2]);
    $sanatized_row = array($fname, $lname, $mobileNumber);
    $recived[$key] = $sanatized_row;
    if (strlen($mobileNumber) > 10 or strlen($mobileNumber) < 9) {
        array_push($non_compliant_rows, $final_array[$key]);
        unset($final_array[$key]);
    }
}
$final_array = array_values($final_array);
echo print_r($final_array);
foreach ($final_array as $item) {
    try {
        $stmt = $conn->prepare("INSERT INTO bulk_sms_list(fname, lname, pn, message, send) VALUES (?, ?, ?, 'EMPTY', 1) ;");
        $stmt->execute($item);
    } catch (PDOException $e) {
        echo $e;
    }
}
echo "done";
The phone numbers column has a UNIQUE constraint to prevent us from having duplicate phone numbers. We have tried to use batch inserting but if one row doesn't comply with the constraints then all of the insertions fail.
below is the schema of the table:
+---------+------+------+-----+---------+----------------+
| Field   | Type | Null | Key | Default | Extra          |
+---------+------+------+-----+---------+----------------+
| id      | int  | NO   | PRI | NULL    | auto_increment |
| fname   | text | NO   |     | NULL    |                |
| lname   | text | NO   |     | NULL    |                |
| pn      | text | NO   | UNI | NULL    |                |
| message | text | YES  |     | NULL    |                |
| send    | int  | NO   |     | 1       |                |
+---------+------+------+-----+---------+----------------+
EDIT:
I have timed the clean-up portion of the script at the request of @aynber. The clean-up took 0.24208784103394 seconds. The SQL portion took 108.2597219944 seconds.
The fastest solution should be to use LOAD DATA LOCAL INFILE. Since you answered in a comment that duplicate phone numbers should result in skipping a row, you can use the IGNORE option.
Load directly from the file, instead of processing it with PHP.
You can do some of your transformations in the LOAD DATA statement.
For example:
LOAD DATA INFILE ? IGNORE INTO TABLE bulk_sms_list
FIELDS TERMINATED BY ','
(@fname, @lname, @mobile)
SET fname = REGEXP_REPLACE(@fname, '[^A-Za-z0-9]', ''),
    lname = REGEXP_REPLACE(@lname, '[^A-Za-z0-9]', ''),
    pn = IF(LENGTH(@mobile) BETWEEN 9 AND 10, @mobile, NULL),
    message = 'EMPTY',
    send = 1;
Then follow the import with some cleanup to get rid of any rows with invalid phone numbers:
DELETE FROM bulk_sms_list WHERE pn IS NULL;
Read https://dev.mysql.com/doc/refman/8.0/en/load-data.html for more information.
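If LOAD DATA is not available (local_infile is often disabled), another option is the batch inserting the question mentions; INSERT IGNORE sidesteps the problem of one bad row failing the whole batch. A sketch (table and column names taken from the question, no database needed) that only builds the multi-row statement and flattened parameter list:

```php
<?php
// Build one multi-row INSERT IGNORE statement. IGNORE skips rows that
// violate the UNIQUE constraint on pn instead of aborting the batch.
function buildBulkInsert(array $rows): array {
    $tuple = "(?, ?, ?, 'EMPTY', 1)";
    $placeholders = implode(', ', array_fill(0, count($rows), $tuple));
    $sql = "INSERT IGNORE INTO bulk_sms_list (fname, lname, pn, message, send) VALUES $placeholders";
    $params = array_merge(...$rows); // flatten [[f,l,p], ...] into one list
    return [$sql, $params];
}

[$sql, $params] = buildBulkInsert([
    ['John', 'Doe', '0400000001'],
    ['Jane', 'Doe', '0400000002'],
]);
// With a real PDO connection:
// $conn->beginTransaction();
// $conn->prepare($sql)->execute($params);
// $conn->commit();
echo $sql, "\n";
```

Wrapping the execute in a transaction also avoids one fsync per row, which is usually the dominant cost of row-at-a-time inserts.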

How to create JSON API service from mysql database in php

I have the following MySQL database table
id | majorFoodName | cropName | foodType  | foodName  | quantity | form
1  | Recipe1       | Rice     | Breakfast | Foodname1 | 500+60   | 2000
4  | Recipe2       | Rice     | Breakfast | Foodname2 | 500      | 1000
6  | Recipe1       | Wheat    | Breakfast | Foodname2 | 518      | 1000
I have written the following PHP code to produce the JSON API output
$sql = "SELECT * FROM food WHERE cropName = 'Rice' AND foodName = 'Foodname1'";
$result = mysqli_query($connect, $sql);
$num_rows = mysqli_num_rows($result);
if ($num_rows > 0) {
    $jsonData = array();
    while ($array = mysqli_fetch_row($result)) {
        $jsonData[] = $array;
    }
}
class Emp {
    public $majorFoods = "";
}
$e = new Emp();
$e->majorFoods = $jsonData;
header('Content-type: application/json');
echo json_encode($e);
I am getting the following JSON output:
{
    "majorFoods": [
        [
            "1",
            "Recipe1",
            "Rice",
            "Breakfast",
            "Foodname1",
            "500+60",
            "2000"
        ]
    ]
}
I need the API to return the following JSON format for all cropName and foodName values:
{
    "Rice": [
        {
            "foodName1": {
                "majorFoodName": "Recipe1",
                "quantity": "500+60",
                "form": "2000"
            },
            "foodName2": {
                "majorFoodName": "Recipe2",
                "quantity": "500",
                "form": "1000"
            }
        }
    ],
    "Wheat": [
        {
            "foodName2": {
                "majorFoodName": "Recipe1",
                "quantity": "518",
                "form": "1000"
            }
        }
    ]
}
Kindly help me improve the PHP code to get the desired API JSON response.
When you're building up your array of data, use the crop name as the main index and create sub-arrays of the extra data you need to store.
while ($array = mysqli_fetch_assoc($result)) {
    $cropData = array($array['foodName'] =>
        array('majorFoodName' => $array['majorFoodName'],
              'quantity' => $array['quantity'],
              'form' => $array['form']));
    $jsonData[$array['cropName']][] = $cropData;
}
Note that I use mysqli_fetch_assoc so that I can refer to the result fields with the column names.
The line
$jsonData[$array['cropName']][] = $cropData;
Accumulates all of the data for a particular crop name together, adding the new data to the end of the array (using []).
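To see the shape this produces without a database, here is the same loop run over hard-coded sample rows (taken from the question's table) standing in for mysqli_fetch_assoc() results:

```php
<?php
// Sample rows in place of mysqli_fetch_assoc() results.
$resultRows = [
    ['majorFoodName' => 'Recipe1', 'cropName' => 'Rice',
     'foodName' => 'Foodname1', 'quantity' => '500+60', 'form' => '2000'],
    ['majorFoodName' => 'Recipe1', 'cropName' => 'Wheat',
     'foodName' => 'Foodname2', 'quantity' => '518', 'form' => '1000'],
];

$jsonData = [];
foreach ($resultRows as $array) {
    // one entry per food, grouped under its crop name
    $cropData = [$array['foodName'] => [
        'majorFoodName' => $array['majorFoodName'],
        'quantity'      => $array['quantity'],
        'form'          => $array['form'],
    ]];
    $jsonData[$array['cropName']][] = $cropData;
}
echo json_encode($jsonData);
```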

Data truncated for a column error with PHP and MySQL?

I have written the following code in PHP and I am getting the error:
Data truncated for column 'datetime_gmt' at row 1;
Here is the code:
$lines = new SplFileObject('/home/file.txt');
$x = 0;
while (!$lines->eof()) {
    $lines->next();
    if ($x == 0) {
        $lines->next();
    }
    $row = explode(',', $lines);
    for ($i = 0; $i < 4; $i++) {
        if (!isset($row[$i])) {
            $row[$i] = null;
        }
    }
    $y = (float) $row[1];
    $z = (float) $row[2];
    $load_query = "INSERT IGNORE INTO new (datetime_gmt,field2,field3)
        VALUES ('".$row[0]."','".$y."','".$z."');";
    $x++;
}
$lines = null;
The column is of type 'datetime' and has '0000-00-00 00:00:00' as DEFAULT, and it is the PRI of the table. If you are wondering about the "x" variable, it's for skipping the first 2 lines.
EDIT 1
Here is a sample data:
2013-12-11 8:22:00, 1.462E+12, 3.33E+11
2013-12-12 4:10:00, 1.462E+12, 3.33E+11
2013-12-13 11:52:00, 1.462E+12, 3.33E+11
And here is the description of the table "new":
Field        | Type       | Null | Key | Default             | Extra
datetime_gmt | datetime   | NO   | PRI | 0000-00-00 00:00:00 |
field2       | bigint(20) | YES  |     | NULL                |
field3       | bigint(20) | YES  |     | NULL                |
using:
SELECT sum(char_length(COLUMN_NAME))
FROM TABLE_NAME;
I get 19 as the size of the column.
Please run this query directly into your database: INSERT IGNORE INTO new (datetime_gmt,field2,field3) VALUES ('2013-12-11 8:22:00','1462000000000','333000000000');
If that does not add a row, remove the default on the datetime_gmt column and re-try.
Note: You have a syntax error with your code.
Change this:
$load_query = "INSERT IGNORE INTO new (datetime_gmt,field2,field3)
VALUES ('".$row[0]"','".$y."','".$z."');";
To this:
$load_query = "INSERT IGNORE INTO new (datetime_gmt,field2,field3)
VALUES ('".$row[0]."','".$y."','".$z."');";
If the aforementioned doesn't work, try to have just engine substitution in your SQL mode:
SET @@sql_mode = 'NO_ENGINE_SUBSTITUTION';
Then make sure that it shows NO_ENGINE_SUBSTITUTION by running the following:
SELECT @@sql_mode;
Then attempt to run your code again. The SET @@sql_mode might not be server-wide; it may only apply to your current session.
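Another likely culprit is the raw value itself: explode(',', ...) leaves a leading space before each field, and the sample data uses single-digit hours ('8:22:00'). A defensive sketch (the helper name normalizeGmt is my own, and the format string is assumed from the sample data) that normalizes each timestamp before it reaches the INSERT:

```php
<?php
// Normalize "2013-12-11 8:22:00" (single-digit hour, possible stray
// whitespace from explode) into MySQL's canonical DATETIME format.
function normalizeGmt(string $raw): ?string {
    // 'G' accepts an hour without a leading zero
    $dt = DateTime::createFromFormat('Y-m-d G:i:s', trim($raw));
    return $dt ? $dt->format('Y-m-d H:i:s') : null; // null = reject the row
}

echo normalizeGmt('2013-12-11 8:22:00'); // prints 2013-12-11 08:22:00
```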

PHP PDO lastinsertid of previous function

I feel like I'm really overly complicating this whole scenario. Hopefully somebody can help.
I have a form which submits data to two tables (items and uploads). The form data goes to items, and the attachment to uploads. Basically I'd like both tables to have a corresponding itemId column.
My two functions create() and uploadFile() both work. However, I'm not sure how to use the lastInsertId value of $crud->create() in my variable named $itemId - see the comments in my code.
Reduced versions of my functions are below, including comments.
class.crud.php
class crud {
    private $db;
    function __construct($DB_con) {
        $this->db = $DB_con;
    }
    public function create($inv, $ip, $make) {
        $stmt = $this->db->prepare("INSERT INTO items (inv,ip,make) VALUES (:inv,:ip,:make)");
        $stmt->bindparam(":inv", $inv);
        $stmt->bindparam(":ip", $ip);
        $stmt->bindparam(":make", $make);
        $stmt->execute();
        return true;
    }
    public function uploadFile($itemId, $inv, $file, $file_type, $file_size) {
        $stmt = $this->db->prepare("INSERT INTO uploads (itemId,inv,file,type,size) VALUES (:itemId,:inv,:file,:file_type,:file_size)");
        $stmt->bindParam(":itemId", $itemId); // inserts 777
        $stmt->bindParam(":inv", $inv);
        $stmt->bindparam(":file", $file);
        $stmt->bindparam(":file_type", $file_type);
        $stmt->bindparam(":file_size", $file_size);
        $stmt->execute();
        return true;
    }
}
add-data.php
if (isset($_POST['btn-save'])) {
    $itemId = '777'; // this successfully inserts 777 into the uploads.itemId table, but I'd like to insert the lastInsertId value of $crud->create()
    $inv = $_POST['inv'];
    $ip = $_POST['ip'];
    $make = $_POST['make'];
    $file = rand(1000, 100000) . "-" . $_FILES['file']['name'];
    $file_loc = $_FILES['file']['tmp_name'];
    $file_size = $_FILES['file']['size'];
    $file_type = $_FILES['file']['type'];
    $folder = "uploaded_files/";
    if ($crud->create($inv, $ip, $make)) {
        echo 'success';
    } else {
        echo 'error';
    }
    if (move_uploaded_file($file_loc, $folder . $file)) {
        $crud->uploadFile($itemId, $inv, $file, $file_type, $file_size);
    }
}
<form method='post' enctype="multipart/form-data">
    <input type='text' name='inv'>
    <input type='text' name='ip'>
    <input type='text' name='make'>
    <input type='file' name='file'>
    <button type="submit" name="btn-save"></button>
</form>
The structure of both my tables is as follows:
items (itemId is primary, unique and auto-increment)
+--------+---------+-----------------+-------+
| itemId | inv | ip | make |
+--------+---------+-----------------+-------+
| 1 | 1293876 | 123.123.123.123 | Dell |
+--------+---------+-----------------+-------+
| 2 | 4563456 | 234.234.234.234 | Dell |
+--------+---------+-----------------+-------+
| 3 | 7867657 | 345.345.345.345 | Apple |
+--------+---------+-----------------+-------+
uploads (upload_id is primary, unique and auto-increment)
+-----------+--------+-----+----------+------------+------+
| upload_id | itemId | inv | file | type | size |
+-----------+--------+-----+----------+------------+------+
| 56 | 777 | 123 | test.txt | text/plain | 266 |
+-----------+--------+-----+----------+------------+------+
| 57 | 777 | 123 | test.txt | text/plain | 266 |
+-----------+--------+-----+----------+------------+------+
| 58 | 777 | 123 | test.txt | text/plain | 266 |
+-----------+--------+-----+----------+------------+------+
Please forgive the messy code. I'm just trying to get the logic correct and then I can work on it.
Any advice is appreciated.
So with help from @Maximus2012 I was able to solve this.
I changed my create() function to return lastInsertId() in place of just true or false. Then, I assigned the value returned by the create() function to a variable rather than using a static value.
So my working code now looks like this:
public function create($inv, $ip, $make) {
    $stmt = $this->db->prepare("INSERT INTO items (inv,ip,make) VALUES (:inv,:ip,:make)");
    $stmt->bindparam(":inv", $inv);
    $stmt->bindparam(":ip", $ip);
    $stmt->bindparam(":make", $make);
    $stmt->execute();
    return $this->db->lastInsertId();
}
Then in my add-data.php page I simply changed one variable to the following:
$itemId = $crud->create($inv, $ip, $make);
Solved.

Querying results by categories and displaying everything as list

I have a table; basically it's something like
------------------
id | name | type
------------------
1 | name1 | type1
2 | name2 | type2
3 | name3 | type3
------------------
I would like to query it and display it into something like
type1
- name1
- and so on...
type2
- name2
- and so on...
type3
- name 3
- and so on...
I am also looking to display it as a JSON file which is something like
[
    {
        "type1": {
            "name": "name1"
        },
        "type2": {
            "name": "name2"
        },
        "type3": {
            "name": "name3"
        }
    }
]
So, may I know the best way to do it?
Using loops to query the type, then selecting the categories and displaying it by type?
Edit :
After searching high and low on the internet, I found this: http://www.tommylacroix.com/2008/09/10/php-design-pattern-building-a-tree/
Hence I formulated this code for my needs. Not sure if it's efficient:
<?php
$query = "SELECT * FROM TABLE";
$list = array();
$result = mysql_query($query);
while ($row = mysql_fetch_array($result)) {
    if (!isset($list[$row['type']])) {
        $list[$row['type']] = array();
    }
    array_push($list[$row['type']], $row['name']);
}
?>
Try this... also see the result output below.
SELECT id,CONCAT_WS(',',type,name) as result FROM table GROUP BY id
<?php
$sql = mysql_query("SELECT id,CONCAT_WS(',',type,name) as result FROM table GROUP BY id");
$data = array();
$i = 0;
while ($row = mysql_fetch_array($sql)) {
    $ex_res = explode(",", $row['result']);
    $data[$ex_res[0]]['name'] = $ex_res[1];
    $i++;
}
$a = json_encode($data);
print_r($a);
?>
Result Output
{"type1":{"name":"name1"},"type2":{"name":"name2"},"type3":{"name":"name3"}}
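The CONCAT_WS/explode round-trip can be skipped entirely by selecting the columns directly and grouping in PHP; a sketch with sample rows standing in for mysqli_fetch_assoc() results:

```php
<?php
// Group rows by type without concatenating and re-splitting in SQL.
// Sample rows stand in for fetched database results.
$rows = [
    ['id' => 1, 'name' => 'name1', 'type' => 'type1'],
    ['id' => 2, 'name' => 'name2', 'type' => 'type2'],
];
$data = [];
foreach ($rows as $row) {
    $data[$row['type']]['name'] = $row['name'];
}
echo json_encode($data); // prints {"type1":{"name":"name1"},"type2":{"name":"name2"}}
```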
