jquery request running more than once - php

I have this code to make a request to a file that processes some XLS files (with the PHPExcel library):
$("#applyCalls").click(function(){
$.get("admin/processCalls.php");
});
processCalls.php looks like this:
include '../phpexcel2/Classes/PHPExcel.php';
include '../phpexcel2/Classes/PHPExcel/IOFactory.php';

date_default_timezone_set('Europe/Bucharest');

$fisierInbound = "inbound.xls";
$fisierOutbound = "outbound.xls";

callsInbound();

function callsInbound() {
    global $fisierInbound;
    global $current;
    global $con;

    $identify = PHPExcel_IOFactory::identify('daily/' . $fisierInbound);
    $objReader = PHPExcel_IOFactory::createReader($identify);
    $objReader->setReadDataOnly(true);
    $objPHPExcel = $objReader->load('daily/' . $fisierInbound);
    $worksheet = $objPHPExcel->setActiveSheetIndex(0);

    $query = mysqli_query($con, "SELECT * FROM users WHERE tip_user='agent' AND NOT (departament='online')");
    while ($usr = mysqli_fetch_array($query)) {
        $data[] = $usr;
    }

    $worksheetTitle = $worksheet->getTitle();
    $highestRow = $worksheet->getHighestRow(); // e.g. 10
    $highestColumn = $worksheet->getHighestColumn(); // e.g 'F'
    $highestColumnIndex = PHPExcel_Cell::columnIndexFromString($highestColumn);

    $dataCalls = $worksheet->getCellByColumnAndRow(2, 2)->getValue();
    $dataSubstr = substr($dataCalls, 53);
    echo $dataSubstr;

    foreach ($data as $db) {
        for ($row = 6; $row <= $highestRow; ++$row) {
            $marca = $worksheet->getCellByColumnAndRow(3, $row)->getValue();
            $nr = $worksheet->getCellByColumnAndRow(4, $row)->getValue();
            if ($db['marca'] == $marca) {
                mysqli_query($con, "INSERT INTO dailycalls(data, nr, departament, oras, tip, id_user)
                    VALUES('$dataSubstr', '$nr', '" . $db['departament'] . "', '" . $db['oras'] . "', 'inbound', '" . $db['id'] . "')");
            }
        }
    }
}
The thing is that I get "No data received. Error code: ERR_EMPTY_RESPONSE" in Chrome from processCalls.php, so the AJAX request fails. Because it fails, I think it ends up running the file more than once, since I get more results than I need in the database. If I run the file manually I get the results I need. (Side note: processCalls.php's query runs despite the error I get.) I hope that makes sense.
Thanks!

Check whether the click handler is getting bound more than once.
If it is bound more than once, the function will be called multiple times.
$("#applyCalls").off('click', callServer).on('click', callServer);
var callServer = function(){
$.get("admin/processCalls.php");
};

Have you tried this?
$("#applyCalls").unbind("click").click(function(){$.get("admin/processCalls.php");});

Try using the stopPropagation() function.
Also, since your PHP doesn't return anything, you should use POST instead of GET.
To understand the difference between POST and GET, have a look at this article:
$("#applyCalls").click(function(event){
event.stopPropagation()
$.post("admin/processCalls.php");
});
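Separately, one way to stop Chrome reporting ERR_EMPTY_RESPONSE, whichever HTTP method you use, is to have processCalls.php send something back when it finishes. A minimal sketch of that idea (the JSON status payload is only an illustration, not part of the original script):

<?php
// Hypothetical tail end of processCalls.php: after the inserts are done,
// return a small response so the AJAX call receives a non-empty body.
header('Content-Type: application/json');
echo json_encode(array('status' => 'ok', 'message' => 'calls imported'));
exit;

The response then shows up in the success callback of $.get()/$.post(), which also makes it easier to see how many times the request actually ran.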

Related

Foreach loop not working properly in my Laravel 9 Code?

[Screenshot of the HTML form]
Can you suggest where I am making a mistake in my Laravel app? Each time I upload multiple files through the form, the files upload perfectly to the given location, but only one record is inserted into the database table. Here is my code...
// Upload bg_certificateArr photo
if ($request->hasFile('bg_certificate')) {
    foreach ($request->file('bg_certificate') as $key => $file) {
        // Get image extention
        $extention = $file->getClientOriginalExtension();
        // Generate new image name
        $bgCertImgName = 'ac-bg-cert-' . date('Y-m-d-H-i-s') . '.' . $extention;
        $bgCertImgPath = 'accountant/images/bank_guarantee/' . $bgCertImgName;
        // Upload Profile Image
        Image::make($file)->save($bgCertImgPath);
        // Insert data into bank guarantee table
        $bg_amountArr = $data['bg_amount'];
        $bg_numberArr = $data['bg_number'];
        $bank_idArr = $data['bank_id'];
        $bg_from_dateArr = $data['bg_from_date'];
        $bg_to_dateArr = $data['bg_to_date'];
        $bg_amount = $bg_amountArr[$key];
        $bg_number = $bg_numberArr[$key];
        $bank_id = $bank_idArr[$key];
        $bg_from_date = $bg_from_dateArr[$key];
        $bg_to_date = $bg_to_dateArr[$key];
        // echo '<pre>';
        // print_r($bg_to_date);
        // die();
        $ac_bg->school_id = $schoolID;
        $ac_bg->ledgers_id = $acLedger->id;
        $ac_bg->bg_amount = $bg_amount;
        $ac_bg->bg_number = $bg_number;
        $ac_bg->bank_id = $bank_id;
        $ac_bg->bg_from_date = $bg_from_date;
        $ac_bg->bg_to_date = $bg_to_date;
        $ac_bg->bg_certificate = $bgCertImgName;
        $ac_bg->save();
    }
}
Initialize $ac_bg on every iteration of the foreach loop:
if ($request->hasFile('bg_certificate')) {
    foreach ($request->file('bg_certificate') as $key => $file) {
        $ac_bg = new AccountBankGuarantee; // like this
        // Get image extention
        $extention = $file->getClientOriginalExtension();
        // Generate new image name
        $bgCertImgName = 'ac-bg-cert-' . date('Y-m-d-H-i-s') . '.' . $extention;
        $bgCertImgPath = 'accountant/images/bank_guarantee/' . $bgCertImgName;
        // Upload Profile Image
        Image::make($file)->save($bgCertImgPath);
        // Insert data into bank guarantee table
        $bg_amountArr = $data['bg_amount'];
        $bg_numberArr = $data['bg_number'];
        $bank_idArr = $data['bank_id'];
        $bg_from_dateArr = $data['bg_from_date'];
        $bg_to_dateArr = $data['bg_to_date'];
        $bg_amount = $bg_amountArr[$key];
        $bg_number = $bg_numberArr[$key];
        $bank_id = $bank_idArr[$key];
        $bg_from_date = $bg_from_dateArr[$key];
        $bg_to_date = $bg_to_dateArr[$key];
        // echo '<pre>';
        // print_r($bg_to_date);
        // die();
        $ac_bg->school_id = $schoolID;
        $ac_bg->ledgers_id = $acLedger->id;
        $ac_bg->bg_amount = $bg_amount;
        $ac_bg->bg_number = $bg_number;
        $ac_bg->bank_id = $bank_id;
        $ac_bg->bg_from_date = $bg_from_date;
        $ac_bg->bg_to_date = $bg_to_date;
        $ac_bg->bg_certificate = $bgCertImgName;
        $ac_bg->save();
    }
}
You haven't posted the full code here, but I believe you set the $ac_bg variable somewhere before the loop. So, looking at your code, you create one record and then update that same record in every subsequent loop iteration.
To fix this you should do something like the following:
foreach ($request->file('bg_certificate') as $key => $file) {
    $ac = new Ac(); // don't know what's the exact model class here
    // ... rest of the loop body ...
}
Then every loop iteration will create a new record instead of updating the existing one.

Why is my do-while loop creating/overwriting 2 separate arrays?

I have the following code that is overwriting my array on the second pass through of the while loop.
Here is my code:
<?php
require '../vendor/autoload.php';
require_once 'constants/constants.php';

use net\authorize\api\contract\v1 as AnetAPI;
use net\authorize\api\controller as AnetController;

require('includes/application_top.php');

define("AUTHORIZENET_LOG_FILE", "phplog");

function getUnsettledTransactionList()
{
    //get orders that are in the exp status
    $orders_pending_query = tep_db_query("select orders_id as invoice_number from " . TABLE_ORDERS . " where orders_status = '14' order by invoice_number");
    $orders_pending = array();
    while ($row = mysqli_fetch_array($orders_pending_query, MYSQLI_ASSOC)) {
        $orders_pending[] = $row;
    }

    /* Create a merchantAuthenticationType object with authentication details
       retrieved from the constants file */
    $merchantAuthentication = new AnetAPI\MerchantAuthenticationType();
    $merchantAuthentication->setName(\SampleCodeConstants::MERCHANT_LOGIN_ID);
    $merchantAuthentication->setTransactionKey(\SampleCodeConstants::MERCHANT_TRANSACTION_KEY);

    // Set the transaction's refId
    $refId = 'ref' . time();
    $pagenum = 1;

    do {
        $request = new AnetAPI\GetUnsettledTransactionListRequest();
        $request->setMerchantAuthentication($merchantAuthentication);

        $paging = new AnetAPI\PagingType;
        $paging->setLimit("1000");
        $paging->setOffset($pagenum);
        $request->setPaging($paging);

        $controller = new AnetController\GetUnsettledTransactionListController($request);
        $response = $controller->executeWithApiResponse(\net\authorize\api\constants\ANetEnvironment::PRODUCTION);

        $transactionArray = array();
        $resulttrans = array();

        if (($response != null) && ($response->getMessages()->getResultCode() == "Ok")) {
            if (null != $response->getTransactions()) {
                foreach ($response->getTransactions() as $tx) {
                    $transactionArray[] = array(
                        'transaction_id' => $tx->getTransId(),
                        'invoice_number' => $tx->getInvoiceNumber()
                    );
                    // echo "TransactionID: " . $tx->getTransId() . "order ID:" . $tx->getInvoiceNumber() . "Amount:" . $tx->getSettleAmount() . "<br/>";
                }
                $invoiceNumbers = array_column($orders_pending, "invoice_number");
                $result = array_filter($transactionArray, function ($x) use ($invoiceNumbers) {
                    return in_array($x["invoice_number"], $invoiceNumbers);
                });
                $resulttrans = array_column($result, "transaction_id");
            } else {
                echo "No unsettled transactions for the merchant." . "\n";
            }
        } else {
            echo "ERROR : Invalid response\n";
            $errorMessages = $response->getMessages()->getMessage();
            echo "Response : " . $errorMessages[0]->getCode() . " " . $errorMessages[0]->getText() . "\n";
        }

        $numResults = (int) $response->getTotalNumInResultSet();
        $pagenum++;
        print_r($resulttrans);
    } while ($numResults === 1000);

    return $resulttrans;
}

getUnsettledTransactionList();
?>
The print_r($resulttrans); actually prints 2 separate arrays instead of the single array I want.
If I move print_r($resulttrans) to after the while loop, I only see the second array, meaning the first array was overwritten. I am not seeing where this happens, though, because it seems to me that all results should be added onto the array.
Your code behaves exactly as you described because you reassign the array variable on every loop iteration, like this:
$resulttrans = array_column($result, "transaction_id");
If you need to collect all the resulting values in the same array, you have to append to it instead. You can do that by merging each new result into your array variable, like this:
$resulttrans = array_merge($resulttrans, array_column($result, "transaction_id"));
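In the posted code that means initializing $resulttrans once before the do-while and merging each page's results into it. A minimal, self-contained sketch of the pattern (the page fetching is stubbed out here; it is not the actual Authorize.Net call):

<?php
// Sketch: accumulate results across pages instead of overwriting them.
// fetchPageOfTransactionIds() is a hypothetical stand-in for the code that
// filters one page of transactions and extracts the "transaction_id" column.
function fetchPageOfTransactionIds($pagenum)
{
    // ... call the API, filter by invoice number, return the IDs ...
    return array();
}

$resulttrans = array();  // initialize ONCE, before the loop
$pagenum = 1;

do {
    $pageIds = fetchPageOfTransactionIds($pagenum);
    // append this page's IDs instead of reassigning $resulttrans
    $resulttrans = array_merge($resulttrans, $pageIds);
    $numResults = count($pageIds);
    $pagenum++;
} while ($numResults === 1000);

print_r($resulttrans);   // now holds the IDs from every page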

Multiple CSV File Creation in PHP

I have the code below:
<?php
/* Error display */
error_reporting(E_ALL);
ini_set('display_errors', 1);
ini_set('memory_limit', '512M');

/* Requires */
require 'conn.php';

/* Parameters (DIM) */
$param_customer = $_POST['param_customer'];
$param_user = $_POST['param_user'];

/* Others */
$param_email = $_POST['email'];
$file_dump_area = "../general_sync/";

/* Array */
$jsonData = array();
$arr_result = array();

/******************************** Download customer *********************************/
$cur_filename = $file_dump_area . removeCharEmail($param_email) . "_" . $param_customer . ".csv";
$cur_file = fopen($cur_filename, "w");
$cur_sql = "CALL android_getCustomer('" . $param_email . "')";
$cur_result = mysqli_query($con, $cur_sql);
if ($cur_file && $cur_result) {
    while ($row = $cur_result->fetch_array(MYSQLI_NUM)) {
        fputcsv($cur_file, array_values($row));
    }
    array_push($arr_result, array('done_process' => "done_cus"));
}
fclose($cur_file);

/******************************** Download user *********************************/
$cur_filename = $file_dump_area . removeCharEmail($param_email) . "_" . $param_customer . "1.csv";
$cur_file = fopen($cur_filename, "w");
$cur_sql = "CALL android_getCustomer('" . $param_email . "')";
$cur_result = mysqli_query($con, $cur_sql);
if ($cur_file && $cur_result) {
    while ($row = $cur_result->fetch_array(MYSQLI_NUM)) {
        fputcsv($cur_file, array_values($row));
    }
    array_push($arr_result, array('done_process' => "done_user"));
}
fclose($cur_file);

$jsonData = array("received" => $arr_result);
echo json_encode($jsonData, JSON_PRETTY_PRINT);

function removeCharEmail($val) {
    $new_val1 = str_replace(".", "", $val);
    $new_val2 = str_replace("#", "", $new_val1);
    return $new_val2;
}
?>
The target output of that code is to create 2 CSV files, which it does, but the problem is that the 2nd CSV contains no data; although the query returns rows, nothing gets written. I tried copying the first block of code; it does create the file, but it still doesn't write to it.
What's the problem?
Update, with help from Mr. Barmar: I got this error: Commands out of sync; you can't run this command now
The stored procedure is apparently returning two result sets. You need to fetch the next result set before you can start another query. Add the following after each loop that fetches the results:
$cur_result->close();
$con->next_result();
See https://stackoverflow.com/a/14561639/1491895 for more details.
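Applied to the script above, the flow looks roughly like this (a sketch only; $con, $param_email and the fopen/fclose handling are assumed to be the same as in the question):

<?php
// Drain the stored procedure's extra result set before the next CALL.
// A CALL via mysqli returns the rows plus a trailing status result, and the
// connection stays busy until both have been consumed.
$cur_result = mysqli_query($con, "CALL android_getCustomer('" . $param_email . "')");
while ($row = $cur_result->fetch_array(MYSQLI_NUM)) {
    fputcsv($cur_file, array_values($row));
}
$cur_result->close();  // done with the rows of the first result set
$con->next_result();   // discard the procedure's status result set

// ... reopen $cur_file for the second CSV exactly as in the original ...

// The connection is free again, so the second CALL now returns data.
$cur_result = mysqli_query($con, "CALL android_getCustomer('" . $param_email . "')");
while ($row = $cur_result->fetch_array(MYSQLI_NUM)) {
    fputcsv($cur_file, array_values($row));
}
$cur_result->close();
$con->next_result();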

JSON increment doesn't work

I am working on a JSON file to which I want to add data instead of overwriting it. However, I cannot seem to do this (it keeps overwriting instead of adding data with an incremented ID).
The code below is database_json.php (as you can see, I include it in saveJson.php):
$databaseFile = file_get_contents('json_files/database.json');
$databaseJson = json_decode($databaseFile, true);
$database = $databaseJson['data'];
The file saveJson.php contains the following code:
// below starts a new page, the page that submits the form called saveJson.php
include_once('database_json.php');

$data = $_POST;
//Setup an empty array.
$errors = array();

if (isset($data)) {
    $newExerciseData = $data;
    $exerciseArray = $data['main_object'];

    $databaseFile = 'json_files/database.json';
    $textContent = file_get_contents($databaseFile);
    $database = json_decode($textContent, true);

    if ($data['id'] === 'new') {
        if (count($database['data']) == 0) {
            $ID = 0;
        } else {
            $maxID = max($database['data']);
            $ID = ++$maxID["id"];
        }
        $newJsonFile = 'jsonData_' . $ID . '.json';
        $newJsonFilePath = 'json_files/' . $newJsonFile;

        //Create new database exercise_txt
        $newArrayData = array(
            'id' => $ID,
            // a lot of variables that aren't related to the problem
        );

        $database['data'][] = $newArrayData;

        file_put_contents($databaseFile, json_encode($database, JSON_UNESCAPED_UNICODE, JSON_PRETTY_PRINT));
        file_put_contents($newJsonFilePath, json_encode($newExerciseData, JSON_UNESCAPED_UNICODE, JSON_PRETTY_PRINT));
    } else {
        $index = array_search((int) $_POST['id'], array_column($database['data'], 'id'));

        $correctJsonFile = 'json_files/jsonData_' . $_POST['id'] . '.json';
        $newJsonFile = 'jsonData_' . $_POST['id'] . '.json';
        $newJsonFilePath = 'json_files/' . $newJsonFile;

        //Create new database exercise_txt
        $newArrayData2 = array(
            'id' => (int) $_POST['id'],
            // more not related to problem variables
        );

        $database['data'][$index] = $newArrayData2;

        file_put_contents($databaseFile, json_encode($database, JSON_UNESCAPED_UNICODE));
        file_put_contents($newJsonFilePath, json_encode($newExerciseData, JSON_UNESCAPED_UNICODE));
    }

    echo json_encode($newExerciseData, JSON_UNESCAPED_UNICODE);
}
So, what I want is to increment the IDs and NOT overwrite the data.
I already did some research and didn't find anything useful besides this: Auto increment id JSON, but it looks to me like I am applying the same principle.

mysql_fetch_array only returns last value of query [duplicate]

This question already has answers here:
How to store values from foreach loop into an array?
(9 answers)
Closed 1 year ago.
Basically, I want to make a download button for my project that produces different CSV files (one CSV file per table) in memory and zips them before download. It works fine, but the problem is that I am only getting one row (the last result) from each mysql_fetch_array, where it is supposed to return as many rows as are stored in the database. This code is deprecated, sorry for that.
Here is my code:
<?php
require("../includes/connection.php");
require("../includes/myLib.php");

//get the ID that is passed
$ID = $_REQUEST['q'];
$ID = xdec($ID);

//All queries to fetch data
$query = mysql_query("SELECT * FROM `ge2`.`projects` WHERE `projects`.`proj_id`='$ID'");
$query2 = mysql_query("SELECT * FROM `ge2`.`attributes` WHERE `attributes`.`proj_id`='$ID'");
$query3 = mysql_query("SELECT * FROM `ge2`.`category` WHERE `category`.`proj_id`='$ID'");
$query4 = mysql_query("SELECT * FROM `ge2`.`multipletarget` WHERE `multipletarget`.`proj_id`='$ID'");
$query5 = mysql_query("SELECT * FROM `ge2`.`data_cut` WHERE `data_cut`.`proj_id`='$ID'");
$query6 = mysql_query("SELECT * FROM `ge2`.`raw` WHERE `raw`.`proj_id`='$ID'");

//getting all array
while ($row = mysql_fetch_array($query)) {
    $proj_alias = $row['proj_alias'];
    $proj_id = $row['proj_id'];
    $date_added = $row['date_added'];
}
while ($row1 = mysql_fetch_array($query2)) {
    $attrib_param_id = $row1['param_id'];
    $attrib_proj_id = $row1['proj_id'];
    $attrib_cat_id = $row1['cat_id'];
    $attrib_val_id = $row1['val_id'];
    $attrib_name = $row1['name'];
    $attrib_isCust = $row1['isCust'];
}
while ($row2 = mysql_fetch_array($query3)) {
    $category_cat_id = $row2['cat_id'];
    $category_name = $row2['name'];
    $category_proj_id = $row2['proj_id'];
    $category_desc = $row2['desc'];
}
while ($row3 = mysql_fetch_array($query4)) {
    $multipletarget_id = $row3['id'];
    $multipletarget_proj_id = $row3['proj_id'];
    $multipletarget_mtarget1 = $row3['mtarget1'];
    $multipletarget_mtarget2 = $row3['mtarget2'];
}
while ($row4 = mysql_fetch_array($query5)) {
    $data_cut_id = $row4['id'];
    $data_cut_proj_id = $row4['proj_id'];
    $data_cut_name = $row4['name'];
    $data_cut_param = $row4['param'];
    $data_cut_lvl = $row4['lvl'];
    $data_cut_val = $row4['val'];
}
while ($row5 = mysql_fetch_array($query6)) {
    $raw_id = $row5['raw_id'];
    $raw_proj_id = $row5['proj_id'];
    $raw_p_id = $row5['p_id'];
    $raw_url = $row5['url'];
    $raw_ip = $row5['ip'];
    $raw_pos = $row5['pos'];
    $raw_datetaken = $row5['datetaken'];
    $raw_used = $row5['used'];
    $raw_fdc_id = $row5['fdc_id'];
    $raw_dq = $row5['dq'];
}

// some data to be used in the csv files
$records = array(
    $proj_alias, $proj_id, $date_added
);
$records2 = array(
    $attrib_param_id, $attrib_proj_id, $attrib_cat_id, $attrib_val_id, $attrib_name, $attrib_isCust
);
$records3 = array(
    $category_cat_id, $category_name, $category_proj_id, $category_desc
);
$records4 = array(
    $multipletarget_id, $multipletarget_proj_id, $multipletarget_mtarget1, $multipletarget_mtarget2
);
$records5 = array(
    $data_cut_id, $data_cut_proj_id, $data_cut_name, $data_cut_param, $data_cut_lvl, $data_cut_val
);
$records6 = array(
    $raw_id, $raw_proj_id, $raw_p_id, $raw_url, $raw_ip, $raw_pos, $raw_datetaken, $raw_used, $raw_fdc_id, $raw_dq
);

//making an array to be used in loop
$set = array($records, $records2, $records3, $records4, $records5, $records6);
//names to be named for each csv file
$names = array('projects', 'attributes', 'category', 'multipletarget', 'data_cut', 'raw');

// create zip file
$zipname = $proj_alias;
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE);

// loop to create csv files
$n = 0;
foreach ($set as $setk => $sets) {
    $n += 1;
    $fd = fopen('php://temp/maxmemory:1048576', 'w');
    if (false === $fd) {
        die('Failed to create temporary file');
    }
    fputcsv($fd, $sets);
    // return to the start of the stream
    rewind($fd);
    // add the in-memory file to the archive, giving a name
    $zip->addFromString('BrainLink-' . $proj_alias . "-" . $names[$setk] . '.csv', stream_get_contents($fd));
    //close the file
    fclose($fd);
}

// close the archive
$zip->close();

header('Content-Type: application/zip');
header('Content-disposition: attachment; filename=' . $zipname . '.zip');
header('Content-Length: ' . filesize($zipname));
readfile($zipname);

// remove the zip archive
// you could also use the temp file method above for this.
unlink($zipname);
?>
Thanks in advance.
Well, it seems that you're iterating over each query result independently and overwriting the variables over and over again, so in the end you only have the last row of each table to work with.
You might try using a JOIN or UNION SELECT in MySQL to get all the items in one big query result row:
$query = mysql_query('SELECT proj.*, attrs.*
    FROM `projects` AS proj
    JOIN `attributes` AS attrs ON (attrs.proj_id = proj.proj_id)
    <..more tables to join in the same manner..>
    WHERE proj.`proj_id` = ' . $projectId);
And then you'll only have to iterate over a single query resource.
while ($row = mysql_fetch_array($query)) {
    // your code here
}
Note that if the tables being JOINed have columns with the same name, they will "overwrite" each other and you'll have to rename them yourself "on the fly", like so:
SELECT proj.field, proj.another_field, proj.conflicting_field_name AS unique_field_name
I did not read all of your code, but in each while loop you only save the last record. It should be something like this:
while ($row = mysql_fetch_array($query)) {
    $proj_alias[] = $row['proj_alias'];
    $proj_id[] = $row['proj_id'];
    $date_added[] = $row['date_added'];
}
and do the same for the other while loops.
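Keep in mind that once those values are arrays, the CSV part also has to write one line per row instead of a single fputcsv() call per table. A rough sketch of that adjustment (collecting whole rows per table is my own assumption, not something spelled out above; $query and $fd are the question's variables):

<?php
// Hypothetical adjustment: keep every row of a table, then write one CSV
// line per row when building the in-memory file.
$projectRows = array();
while ($row = mysql_fetch_array($query, MYSQL_ASSOC)) {
    $projectRows[] = $row;               // keep the whole row
}

foreach ($projectRows as $row) {
    fputcsv($fd, array_values($row));    // one CSV line per database row
}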
