How to repeat a function in PHP? - php

Is it possible to repeat a function after it has finished? For example: I have a function that exports MySQL rows to a JSON file with a limit of 100 records. If it succeeds, it creates a JSON file with 100 records. It should then repeat the same function to create a JSON file with the next 100 records (no duplicate data), until the data runs out.
My code for generating the JSON file:
$results = $db->SELECT()
    ->FROM(array('MM' => 'M_MEMBER'),
        array(
            'MEMBER_ID'     => 'MM.MEMBER_ID',
            'FIRST_NAME'    => 'MM.FIRST_NAME',
            'LAST_NAME'     => 'MM.LAST_NAME',
            'MEMBER_GROUP'  => 'MM.MEMBER_GROUP',
            'MEMBER_GROUP1' => 'MM.MEMBER_GROUP1',
            'PHONE_NUMBER'  => 'MM.PHONE_NUMBER',
            'MEMBERSHIP'    => 'MM.MEMBERSHIP',
            'UPLOAD_DATE'   => 'MM.UPLOAD_DATE',
            'STATUS'        => 'MM.STATUS'
        )
    )
    ->WHERE('DATE(MM.UPLOAD_DATE) = CURDATE()')
    ->WHERE('SYNC_FLAG = ?', 'N')
    ->LIMIT(100)
    ->QUERY()->FETCHALL();
if (!empty($results) && $results['SYNC_FLAG'] != 'Y')
{
    $counter = formatNbr($counterFile);
    $data = array();
    foreach ($results as $key => $row) {
        $data[$key] = $row;
        $data[$key]['_id'] = (string) Application_Helper_General::generateIdJsonFile();
        $queryUdateMemberFlag = 'UPDATE M_MEMBER SET SYNC_FLAG = "Y" WHERE MEMBER_ID = '.$row['MEMBER_ID'].'';
        $db->query($queryUdateMemberFlag);
    }
    $out = array_values($data);
    $jsonAr = json_encode($out);
    $json = substr($jsonAr, 1, -1);
    $jsonData = preg_replace('/[\x00-\x1F\x80-\xFF]/', '', $json);
    $file = $store_path_pos.'dataMember_'.date('Y-m-d').'_'.$counter.'.json';
    $createJson = file_put_contents($file, $jsonData);
    if ($createJson) {
        echo "Create Json File Success In :".$file;
    } else {
        echo "Create Json Failed";
    }
}
This code only generates one JSON file. How can it repeat after a JSON file has been generated successfully?
Note: I set a flag on each row that has been successfully exported to a JSON file.

You can extract your code into a function and call that function in a loop.
Example:
function exporter($db, $counterFile, $store_path_pos, $limit = 100)
{
    $results = $db->SELECT()
        ->FROM(array('MM' => 'M_MEMBER'),
            array(
                'MEMBER_ID'     => 'MM.MEMBER_ID',
                'FIRST_NAME'    => 'MM.FIRST_NAME',
                'LAST_NAME'     => 'MM.LAST_NAME',
                'MEMBER_GROUP'  => 'MM.MEMBER_GROUP',
                'MEMBER_GROUP1' => 'MM.MEMBER_GROUP1',
                'PHONE_NUMBER'  => 'MM.PHONE_NUMBER',
                'MEMBERSHIP'    => 'MM.MEMBERSHIP',
                'UPLOAD_DATE'   => 'MM.UPLOAD_DATE',
                'STATUS'        => 'MM.STATUS'
            )
        )
        ->WHERE('DATE(MM.UPLOAD_DATE) = CURDATE()')
        ->WHERE('SYNC_FLAG = ?', 'N')
        ->LIMIT($limit)
        ->QUERY()->FETCHALL();
    // FETCHALL() returns a list of rows, so it is enough to check that it is not empty
    if (!empty($results))
    {
        $counter = formatNbr($counterFile);
        $data = array();
        foreach ($results as $key => $row) {
            $data[$key] = $row;
            $data[$key]['_id'] = (string) Application_Helper_General::generateIdJsonFile();
            $queryUdateMemberFlag = 'UPDATE M_MEMBER SET SYNC_FLAG = "Y" WHERE MEMBER_ID = '.$row['MEMBER_ID'].'';
            $db->query($queryUdateMemberFlag);
        }
        $out = array_values($data);
        $jsonAr = json_encode($out);
        $json = substr($jsonAr, 1, -1);
        $jsonData = preg_replace('/[\x00-\x1F\x80-\xFF]/', '', $json);
        $file = $store_path_pos.'dataMember_'.date('Y-m-d').'_'.$counter.'.json';
        $createJson = file_put_contents($file, $jsonData);
        if ($createJson) {
            echo "Create Json File Success In :".$file;
        } else {
            echo "Create Json Failed";
        }
    }
}

$numberOfRows = 10000; # use the result of a COUNT(*) query here
$limit = 100;
while ($numberOfRows > 0) {
    exporter($db, $counterFile, $store_path_pos, $limit);
    $numberOfRows -= $limit;
}
You can also call the function recursively from inside itself (https://www.w3schools.blog/php-recursive-functions).
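For illustration, a minimal recursive sketch. It assumes exporter() is changed to return the number of rows it actually exported (the version above does not return anything), so the recursion can stop as soon as a batch comes back short:
// Hedged sketch: assumes exporter() returns how many rows it exported
function exportAll($db, $counterFile, $store_path_pos, $limit = 100)
{
    $exported = exporter($db, $counterFile, $store_path_pos, $limit);
    if ($exported === $limit) {
        // a full batch means more unsynced rows may remain, so run once more
        exportAll($db, $counterFile, $store_path_pos, $limit);
    }
}

exportAll($db, $counterFile, $store_path_pos);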

Putting the existing code in a loop seems like the obvious answer here.
do {
    $results = $db
        ->SELECT()
        ->FROM(
            ['MM' => 'M_MEMBER'],
            [
                'MEMBER_ID'     => 'MM.MEMBER_ID',
                'FIRST_NAME'    => 'MM.FIRST_NAME',
                'LAST_NAME'     => 'MM.LAST_NAME',
                'MEMBER_GROUP'  => 'MM.MEMBER_GROUP',
                'MEMBER_GROUP1' => 'MM.MEMBER_GROUP1',
                'PHONE_NUMBER'  => 'MM.PHONE_NUMBER',
                'MEMBERSHIP'    => 'MM.MEMBERSHIP',
                'UPLOAD_DATE'   => 'MM.UPLOAD_DATE',
                'STATUS'        => 'MM.STATUS',
            ]
        )
        ->WHERE('DATE(MM.UPLOAD_DATE) = CURDATE()')
        ->WHERE('SYNC_FLAG = ?', 'N')
        ->LIMIT(100)
        ->QUERY()
        ->FETCHALL();

    if (count($results) === 0) {
        break;
    }

    $data = [];
    foreach ($results as $row) {
        $row['_id'] = (string) Application_Helper_General::generateIdJsonFile();
        $data[] = $row;
        //
        // this is UNSAFE and inefficient, use a prepared statement if possible
        //
        $queryUdateMemberFlag = 'UPDATE M_MEMBER SET SYNC_FLAG = "Y" WHERE MEMBER_ID = '.$row['MEMBER_ID'].'';
        $db->query($queryUdateMemberFlag);
    }

    $json = json_encode($data);
    $filename = sprintf(
        "%sdataMember_%s_%s.json",
        $store_path_pos,
        date("Y-m-d"),
        formatNbr($counterFile)
    );
    if ($json && file_put_contents($filename, $json)) {
        echo "Create Json File Success In $filename";
    } else {
        echo "Create Json Failed";
    }
} while (true);
I fixed a few inefficiencies in your code. I have no idea which DB library you're using, but as a rule you should never concatenate variables into an SQL query. It's probably harmless in this particular context, but if your library allows it, prepare the statement once outside the loop and execute it inside the loop; that is both safer and faster.
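As an illustration only (the $db wrapper in the question is unknown, so adapt this to its own prepare/execute API), this is roughly what that looks like with plain PDO; the DSN and credentials below are placeholders:
// Hedged sketch using PDO; connection details are placeholders
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// prepare once, outside the loop
$update = $pdo->prepare('UPDATE M_MEMBER SET SYNC_FLAG = "Y" WHERE MEMBER_ID = ?');

foreach ($results as $row) {
    // the value is bound as a parameter, never concatenated into the SQL string
    $update->execute([$row['MEMBER_ID']]);
}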

Related

Laravel maatwebsite excel package import 2000+ rows results in 1390 code error

I have 4 Excel files with 1000 rows each. I merged them into one 4000-row file to save some time, but when I import the merged file it returns: General error: 1390 Prepared statement contains too many placeholders. When I import the files one by one it works. I don't know why it returns this error; the rows even have the same kinds of values in every column. Can someone tell me what to do about this error? Help would be appreciated. Thanks a lot.
I'm using the Laravel maatwebsite excel package.
My import code:
public function import(Request $request)
{
if($request->hasFile('template')){
$path = $request->file('template')->getRealPath();
$data = \Excel::load($path)->get();
if($data->count() > 0){
$rows = $data->toArray();
foreach ($rows as $row) {
$level = '';
$stack = '';
$unit = '';
$gunit = '';
$street = '';
$block_unit = '';
//DONT MIND THIS FUNCTION HERE
if (strpos($row['address'], '#') !== false) {
$unit = explode("#",$row['address']);
$gunit = $unit[1];
$block = explode(" ",$unit[0],2);
if(isset($unit[1])) {
$x = explode("-",$unit[1]);
$level = $x[0];
$stack = $x[1];
$street = $block[1];
$block_unit= $block[0];
}
}
elseif(strpos($row['address'], ' ') !== false){
$unit = explode(" ",$row['address']);
$block = explode(" ",$unit[0],2);
if(isset($unit[1])) {
$x = explode("-",$unit[1]);
$level = '';
$stack = '';
$x = preg_replace('/[0-9]+/', '', $row['address']);
$street = $x.substr($street,1);
$block_unit= $block[0];
}
}
else{
$level = '';
$stack = '';
$unit = '';
$gunit = '';
$block_unit = '';
$street = '';
}
//END
$inserts[]=[
'transtype' => 'RESI',
'project_name' => $row['project_name'],
'unitname' => $gunit,
'block' => $block_unit,
'street' => $street,
'level' => $level,
'stack' => $stack,
'no_of_units' => $row['no._of_units'],
'area' => $row['area_sqm'],
'type_of_area' => $row['type_of_area'],
'transacted_price' => $row['transacted_price'],
'nettprice' => $row['nett_price'],
'unitprice_psm' => $row['unit_price_psm'],
'unitprice_psf' => $row['unit_price_psf'],
'sale_date' => $row['sale_date'],
'contract_date' => $row['sale_date'],
'property_type' => $row['property_type'],
'tenure' => $row['tenure'],
'completion_date' => $row['completion_date'],
'type_of_sale' => $row['type_of_sale'],
'purchaser_address_indicator' => $row['purchaser_address_indicator'],
'postal_district' => $row['postal_district'],
'postal_sector' => $row['postal_sector'],
'postal_code' => $row['postal_code'],
'planning_region' => $row['planning_region'],
'planning_area' => $row['planning_area'],
];
}
}
if(empty($inserts)){
dd('Request data does not have any files to import.');
}
else {
\DB::table('xp_pn_ura_transactions')->insert($inserts);
dd('record inserted');
}
}
}
The error is not really Laravel-specific: a MySQL prepared statement can hold at most 65,535 placeholders, and a single insert() call with 4000 rows times roughly 26 columns exceeds that limit. Split the insert into smaller batches, for example with PHP's array_chunk() function or Laravel's collection chunk() method.
For example, using array_chunk():
$chunked = array_chunk($inserts, 1000);
foreach ($chunked as $insert) {
    \DB::table('xp_pn_ura_transactions')->insert($insert);
}
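A roughly equivalent sketch using a Laravel collection (assuming $inserts is the same array built in the question's loop):
// chunk the rows with a collection; each $chunk is itself a collection of row arrays
collect($inserts)
    ->chunk(1000)
    ->each(function ($chunk) {
        \DB::table('xp_pn_ura_transactions')->insert($chunk->toArray());
    });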

How to encode json with multiple rows?

Before I begin, I have looked through other examples and Q&As on multiple platforms, but none of them seem to solve my problem. I am trying to return multiple rows from MySQL as JSON, but I have been unable to. The code below shows my attempt.
I check the responses via Postman. The while version returns only the last entry in the database, and the do-while version returns all entries but doesn't produce valid JSON: Postman reports a JSON syntax error, although the HTML view shows all entries.
<?php
$dashboard_content_token = $_REQUEST["dashboard_content_token"];
$token = "g4";
require(cc_scripts/connect.php);
$sql = "SELECT * FROM `dashboard_content`";
$check = strcmp("$token", "$dashboard_content_token");
$statement = mysqli_query($con, $sql);
if (check) {
$rows = mysqli_fetch_assoc($statement);
if (!$rows) {
echo "No results!";
} else {
while ($rows = mysqli_fetch_assoc($statement)) {
$news_id = $rows['news_id'];
$image_url = $rows['image_url'];
$news_title = $rows['news_title'];
$news_description = $rows['news_description'];
$news_article = $rows['news_article'];
$result['dashboard content: '][] = array('news_id' => $news_id, 'image_url' => $image_url, 'news_title' => $news_title, 'news_description' => $news_description, 'news_article' => $news_article);
echo json_encode($result);
}
// do {
// $news_id = $rows['news_id'];
// $image_url = $rows['image_url'];
// $news_title = $rows['news_title'];
// $news_description = $rows['news_description'];
// $news_article = $rows['news_article'];
// $result['dashboard content: '][] = array('news_id' => $news_id, 'image_url' => $image_url, 'news_title' => $news_title, 'news_description' => $news_description, 'news_article' => $news_article);
// echo json_encode($result);
// } while ($rows = mysqli_fetch_assoc($statement));
mysqli_free_result($statement);
}
}
?>
This should work. You'll want to use the do...while statement, otherwise the first result is skipped (it has already been fetched before the loop).
<?php
$dashboard_content_token = $_REQUEST["dashboard_content_token"];
$token = "g4";
require("cc_scripts/connect.php");
$sql = "SELECT * FROM `dashboard_content`";
$check = strcmp($token, $dashboard_content_token);
$statement = mysqli_query($con, $sql);
if ($check === 0) { // strcmp() returns 0 when the tokens match
    $rows = mysqli_fetch_assoc($statement);
    if (!$rows) {
        echo "No results!";
    } else {
        do {
            $news_id = $rows['news_id'];
            $image_url = $rows['image_url'];
            $news_title = $rows['news_title'];
            $news_description = $rows['news_description'];
            $news_article = $rows['news_article'];
            $result['dashboard content: '][] = array('news_id' => $news_id, 'image_url' => $image_url, 'news_title' => $news_title, 'news_description' => $news_description, 'news_article' => $news_article);
        } while ($rows = mysqli_fetch_assoc($statement));
        mysqli_free_result($statement);
        echo json_encode($result);
    }
}
?>
The key is to put all of your results into one array and then call json_encode() only once. If you call json_encode() multiple times, your API will return invalid JSON.
In your while loop,
$result['dashboard content: '] = array('news_id' => $news_id, 'image_url' => $image_url, 'news_title' => $news_title, 'news_description' => $news_description, 'news_article' => $news_article);
just over-writes the same "dashboard content" entry in the $result array every time you run the loop. This is why you only see the last entry.
Calling json_encode() inside the loop also makes no sense, because you'll output multiple, disconnected JSON objects that are not part of one array or coherent structure. That is not a valid JSON response.
It's not abundantly clear exactly what output structure you're hoping for, but this might give you either a solution, or at least a shove in the right direction:
$statement = mysqli_query($con, $sql);
// create an associative array with a property called "dashboard_content", which is an array
// (json_encode will convert an associative array to a JSON object)
$result = array("dashboard_content" => array());
if ($check === 0) { // strcmp() returns 0 when the tokens match
    $rows = mysqli_fetch_assoc($statement);
    if (!$rows) {
        echo "No results!";
    } else {
        // do...while, so the row fetched above is not skipped
        do {
            $news_id = $rows['news_id'];
            $image_url = $rows['image_url'];
            $news_title = $rows['news_title'];
            $news_description = $rows['news_description'];
            $news_article = $rows['news_article'];
            // append the current data to a new entry in the "dashboard_content" array
            $result["dashboard_content"][] = array('news_id' => $news_id, 'image_url' => $image_url, 'news_title' => $news_title, 'news_description' => $news_description, 'news_article' => $news_article);
        } while ($rows = mysqli_fetch_assoc($statement));
    }
}
// now output the whole completed result as one single, coherent, valid JSON object
echo json_encode($result);
You should end up with some JSON like this:
{
"dashboard_content": [
{
"news_id": 1,
"image_url": "abc",
"news_title": "xyz",
//...etc
},
{
"news_id": 2,
"image_url": "def",
"news_title": "pqr",
//...etc
},
//...etc
]
}

How to convert a multidimensional associative array to JSON?

As you can see from the JSON below, the output is not what I need. I get values from inputs, add them to an array, and send it via AJAX. I know how to convert a simple array, but not a multidimensional one. Is there a function for that? I tried to build an array with keys, but ran into a lot of trouble, never got to the end of it, and I'm sure it's not right. Please tell me what I can do.
I want this:
{
    "user1" : {
        "first_name":"Michael",
        "last_name":"Podlevskykh",
        "phones" : {
            "phone_1":"5345",
            "phone_2":"345345",
            "phone_3":"123"
        }
    }
}
// This is what I see instead:
JSON
[
{"first_name":"Michael"},
{"last_name":"Podlevskykh"},
[{"phone_1":"5345"},
{"phone_2":"345345"},
{"phone_3":"0991215078"}
]
]
PHP
//[["5345", "345345", "123"], "Michael", "Podlevskykh"]
$userInfo = (json_decode($_POST["phones"], true));
$namePhones = ["phone_1", "phone_2", "phone_3"];
$nameUser = ["first_name", "last_name"];
$jsonPhones = $userInfo;
$nameLName = $userInfo;
$jsonPhones = array_splice($jsonPhones, 0, 1);
$nameLName = array_splice($nameLName, -2);
foreach ($jsonPhones[0] as $key => $value) {
$phones[] = array($namePhones[$key] => $jsonPhones[0][$key]);
}
foreach ($nameLName as $key => $value) {
$usersName[] = array($nameUser[$key] => $nameLName[$key]);
}
array_push($usersName, $phones);
echo "<pre>";
echo json_encode($usersName);
//[
// {"first_name":"Michael"},{"last_name":"Podlevskykh"},
// [{"phone_1":"5345"},{"phone_2":"345345"},{"phone_3":"123"}]
//]
I don't fully understand all the complications; I would do something like this, assuming the $input format is always the same:
<?php
$input = '[["5345", "345345", "123"], "Michael", "Podlevskykh"]';
$input = json_decode($input, true);
$output = [
    'user1' => [
        'first_name' => $input[1],
        'last_name'  => $input[2],
        'phones'     => [
            'phone_1' => $input[0][0],
            'phone_2' => $input[0][1],
            'phone_3' => $input[0][2]
        ]
    ]
];
echo '<pre>';
echo json_encode($output);
If you want an object as output, you need to create an object:
$userInfo = json_decode($_POST["phones"], true);
$namePhones = ["phone_1", "phone_2", "phone_3"];
$nameUser = ["first_name", "last_name"];
$jsonPhones = $userInfo;
$nameLName = $userInfo;
$jsonPhones = array_splice($jsonPhones, 0, 1);
$nameLName = array_splice($nameLName, -2);

$user = new stdClass();
foreach ($nameLName as $key => $value) {
    $user->{$nameUser[$key]} = $nameLName[$key];
}

$phones = new stdClass();
foreach ($jsonPhones[0] as $key => $value) {
    $phones->{$namePhones[$key]} = $jsonPhones[0][$key];
}

$user->phones = $phones;

$users = new stdClass();
$users->user1 = $user;

echo json_encode($users);
Output:
{"user1": {
"first_name":"Michael",
"last_name":"Podlevskykh",
"phones":{
"phone_1":"5345",
"phone_2":"345345",
"phone_3":"123"
}
}
}

CakePHP 3 fast insertion/updating of records

I am trying to insert/update roughly 10k rows with a foreach loop. The complete loop takes about 3-5 minutes. Are there any tips on my code to make the insert or update faster? The $rows are retrieved from an xls file converted to a DOMDocument.
foreach($rows as $key => $row)
{
if($key < 1){continue;}
$cells = $row -> getElementsByTagName('td');
foreach ($cells as $cell) {
$project_id = $cells[0]->nodeValue;
$title = $cells[1]->nodeValue;
$status = $cells[2]->nodeValue;
$projectmanager = $cells[3]->nodeValue;
$engineer = $cells[4]->nodeValue;
$coordinator = $cells[5]->nodeValue;
$contractor_a = $cells[6]->nodeValue;
$contractor_b = $cells[7]->nodeValue;
$gsu = $cells[9]->nodeValue;
$geu = $cells[10]->nodeValue;
$query = $this->Projects->find('all')->select(['project_id'])->where(['project_id' => $project_id]);
if ($query->isEmpty()) {
$project = $this->Projects->newEntity();
$project->title = $title;
$project->project_id = $project_id;
$project->status = $status;
$project->projectmanager = $projectmanager;
$project->engineer = $engineer;
$project->coordinator = $coordinator;
$project->contractor_a = $contractor_b;
$project->contractor_b = $contractor_a;
$project->gsu = date("Y-m-d H:i:s");
$project->geu = date("Y-m-d H:i:s");
$project->gsm = date("Y-m-d H:i:s");
$project->gem = date("Y-m-d H:i:s");
if ($this->Projects->save($project)) {
//$this->Flash->success(__('The project has been saved.'));
continue;
}else{
debug($project->errors());
}
}else{
continue;
$query->title = $title;
$query->status = $status;
$query->projectmanager = $projectmanager;
$query->engineer = $engineer;
$query->coordinator = $coordinator;
$query->contractor_a = $contractor_b;
$query->contractor_b = $contractor_a;
$query->gsu = $gsu;
$query->geu = $geu;
if ($this->Projects->save($query)) {
//$this->Flash->success(__('The project has been saved.'));
continue;
}
}
}
//$this->Flash->error(__('The project could not be saved. Please, try again.'));
}
For faster bulk inserts, don't use entities; generate insert queries directly with the query builder:
https://book.cakephp.org/3.0/en/orm/query-builder.html#inserting-data
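A rough sketch of what that could look like for the question's Projects table, following the linked docs. Only a few of the columns are shown, and it reuses the $rows/$cells parsing from the question; treat it as an outline under those assumptions, not a drop-in replacement:
// build one multi-row INSERT with the query builder instead of saving entities one by one
$query = $this->Projects->query()->insert(['project_id', 'title', 'status']);
foreach ($rows as $key => $row) {
    if ($key < 1) { continue; }               // skip the header row, as in the original loop
    $cells = $row->getElementsByTagName('td');
    $query->values([                          // each values() call adds another row to the insert
        'project_id' => $cells[0]->nodeValue,
        'title'      => $cells[1]->nodeValue,
        'status'     => $cells[2]->nodeValue,
    ]);
}
$query->execute();                            // executes a single INSERT for all rows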
Hello, my friend.
The TableClass->save() method is useful for saving one single record; in your case, you should use TableClass->saveMany() instead.
For this to happen, build your records as plain arrays inside your foreach rather than entities.
After the foreach, use another table method, newEntities(), to convert those arrays into entities before finally saving them.
Basic example:
//Let's suppose our array after our foreach becomes something like this:
$all_records =
[
//Each item will be an array, not entities yet
[
'name' => 'I.N.R.I.',
'year' => '1987',
'label' => 'Cogumelo',
'country' => 'Brazil',
'band' => 'Sarcófago',
'line_up' => '[{"name":"Wagner Antichrist","role":"Vomits, Insults"},{"name":"Gerald Incubus","role":"Damned Bass"},{"name":"Z\u00e9der Butcher","role":"Rotten Guitars"},{"name":"D.D. Crazy","role":"Drums Trasher"}]'
],
//Another record coming in..
[
'name' => 'Eternal Devastation',
'year' => '1986',
'label' => 'Steamhammer',
'country' => 'Germany',
'band' => 'Destruction',
'line_up' => '[{"name":"Marcel Schmier","role":"Vocals, Bass"},{"name":"Mike Sifringer","role":"Guitars"},{"name":"Tommy Sandmann","role":"Drums"}]'
]
];
//Time to get the tableclass...
$albums = TableRegistry::get('Albums');
//Time to transform our array into Album Entities
$entities = $albums->newEntities($all_records);
//Now, we have transformed our array into entities on $entities, this is the variable we need to save
if(!$albums->saveMany($entities))
{
echo "FellsBadMan";
}
else
{
echo "FellsGoodMan";
}
You can read more about it here.

Storing data in a session, can only seem to store 1 piece of data at a time

I am trying to allow the user to create a shortlist; however, when I add one thing to the shortlist, it overwrites whatever else is in there. What am I doing wrong?
function validate_add_cart_item($id){
$this->db->select('candidates.candidate_id, candidates.first_name, candidates.surname, candidates.DOB, candidates.talent, candidates.location, candidates.availability_order, candidates.availability, candidates.availability_comments, candidate_assets.url, candidate_assets.asset_size')
->from('candidates')
->join('candidate_assets', 'candidate_assets.candidates_candidate_id = candidates.candidate_id', 'left')
->where('candidate_assets.asset_size', 'small')
->where('candidate_assets.asset_type', 'image')
->where('candidates.candidate_id', (int)$id)
->limit(1)
->order_by('candidates.availability_order', 'DESC');
$query = $this->db->get();
//die($this->db->last_query());
// Check if a row has been found
if($query->num_rows > 0 ){
foreach ($query->result() as $row)
{
$data = array(
'id' => $id,
'name' => $row->first_name." ".$row->surname,
'location' => $row->location,
'talent' => $row->talent,
'image' => $row->url
);
$this->session->set_userdata('shortlist', array($data));
return TRUE;
}
// Nothing found! Return FALSE!
} else {
return FALSE;
}
foreach (...) {
    // ...
    $this->session->set_userdata('shortlist', array($data));
}
You are overwriting the previously stored userdata on every iteration. Instead, build an array first and call set_userdata() once:
$udata = array();
foreach (...) {
    // ...
    $udata[] = $data;
}
$this->session->set_userdata('shortlist', $udata);
Edit:
Also, since your query returns only one row per call, you probably want to append to the array already stored in the session rather than overwrite it, so something like this:
$udata = $this->session->userdata('shortlist'); // CodeIgniter's getter is userdata(), not get_userdata()
foreach (...) {
    // ...
    $udata[] = $data;
}
$this->session->set_userdata('shortlist', $udata);
$this->session->set_userdata('shortlist', array($data));
doesn't append $data to shortlist but overwrites it with an array that has one element: the current data in $data.
You probably (untested) want:
if ($query->num_rows > 0) {
    $data = array();
    foreach ($query->result() as $row)
    {
        // append the new "record" to the array $data
        $data[] = array(
            'id' => $id,
            'name' => $row->first_name." ".$row->surname,
            'location' => $row->location,
            'talent' => $row->talent,
            'image' => $row->url
        );
    }
    $this->session->set_userdata('shortlist', $data);
    return TRUE;
}
else {
    ...
