<?php
include_once '../includes/db_connect.php';

fetch_evt_values($conn, 7475, 2, 16);

function fetch_evt_values($conn, $p_frm_id, $p_evt_id, $p_usr_id) {
    $p_rec_id = 0;
    $l_rslt_msg = '';
    $l_result = array(
        'data' => array(),
        'msg' => '0000'
    );
    $sql = 'BEGIN PHPEVT.EV_MOD.FETCH_EVT_VALUES(';
    //$sql .= ':c_load_id,';
    $sql .= ':c_frm_id,';
    $sql .= ':c_evt_id,';
    $sql .= ':c_rec_id,';
    $sql .= ':c_usr_id,';
    $sql .= ':c_rslt';
    $sql .= '); END;';
    if ($stmt = oci_parse($conn, $sql)) {
        $l_results = oci_new_cursor($conn);
        //oci_bind_by_name($stmt, ':c_load_id', $p_load_id);
        oci_bind_by_name($stmt, ':c_frm_id', $p_frm_id);
        oci_bind_by_name($stmt, ':c_evt_id', $p_evt_id);
        oci_bind_by_name($stmt, ':c_rec_id', $p_rec_id);
        oci_bind_by_name($stmt, ':c_usr_id', $p_usr_id);
        oci_bind_by_name($stmt, ':c_rslt', $l_results, -1, OCI_B_CURSOR);
        if (oci_execute($stmt)) { // Execute the prepared query.
            oci_execute($l_results);
            while ($r = oci_fetch_array($l_results, OCI_ASSOC)) {
                $l_evt_values = explode('|', $r['EVENT_VALUES']);
                foreach ($l_evt_values as $l_evt_value) {
                    list($l_ID, $l_value) = explode('#', $l_evt_value);
                    $l_values[] = array('ID' => $l_ID, 'VALUE' => $l_value);
                }
                $l_result['data'][] = array(
                    'LOAD_ID' => $r['LOAD_ID'],
                    'REC_ID' => $r['REC_ID'],
                    'TRAIT' => $l_values,
                    'G_MSG' => $r['G_MSG']
                );
                $l_rslt_msg = $r['G_MSG'];
            }
        } else {
            //echo 'cannot get user';
            $l_rslt_msg = '0005'; // PHP_MEMBER.FETCH_USER returned an error code.
        }
    } else {
        //echo 'connect fail';
        $l_rslt_msg = '0006'; // Could not connect to database.
    }
    oci_close($conn);
    echo json_encode($l_result);
}
?>
So on a webpage, when a user requests an event, a database call is made using this code to retrieve some values in the format:
"62#20000|65#15710|66#6|67#6|68#0|69#0|".
The PHP then breaks it apart by |, splits each ID#Value pair on #, puts everything into an array, and returns it as JSON, which is then parsed into a table. That part works perfectly fine. But when this tries to fetch more than about 600 records or so, I get a 500 Internal Server Error, and I've figured it's something in this PHP that's handling the call.
I'm not convinced it's the database entirely, as a call for 3500 records with no further processing other than returning the JSON generally completes in 5s or less.
Why would this code be failing at 500+ records? I've tried an AJAX timeout of 0.
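One detail worth checking in the loop above: $l_values is never reset between cursor rows, so each record's TRAIT array also carries every value from all previous records. The JSON payload then grows quadratically with the record count, which can exhaust PHP's memory limit and show up as exactly this kind of 500 error. A minimal sketch of the same loop with the array reset per row (plus a guard for the empty segment produced by the trailing |):

while ($r = oci_fetch_array($l_results, OCI_ASSOC)) {
    $l_values = array(); // reset per record so TRAIT holds only this row's values
    foreach (explode('|', $r['EVENT_VALUES']) as $l_evt_value) {
        if ($l_evt_value === '') {
            continue; // the trailing '|' in "...|69#0|" produces one empty segment
        }
        list($l_ID, $l_value) = explode('#', $l_evt_value);
        $l_values[] = array('ID' => $l_ID, 'VALUE' => $l_value);
    }
    $l_result['data'][] = array(
        'LOAD_ID' => $r['LOAD_ID'],
        'REC_ID'  => $r['REC_ID'],
        'TRAIT'   => $l_values,
        'G_MSG'   => $r['G_MSG']
    );
    $l_rslt_msg = $r['G_MSG'];
}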
I'm trying to send an array of data from PHP to AJAX. I'm using echo json_encode() to do it. When I do that, I try console.log(data) to see the response data, but it doesn't show anything. How can I get it to display the data? I really don't know what I'm missing here. I have this script:
var scard = $('#cardid').val();
$.ajax({
    type: 'GET',
    url: 'cardapi.php?scard=' + scard,
    success: function (data) {
        console.log($.parseJSON(data));
        console.log(data);
    }
});
And here is my code for cardapi.php
if (isset($_GET["scard"])) {
    $scard = $_GET["scard"];
    $data = array();
    $sql = "SELECT * FROM training_record WHERE cardref_no='$scard'";
    $q = sqlsrv_query($conn, $sql);
    while ($rw = sqlsrv_fetch_array($q, SQLSRV_FETCH_ASSOC)) {
        array_push($data, [
            "employee_no" => $rw["employee_no"],
            "dept_id" => $rw["dept_id"],
            "name_th" => $rw["name_th"],
            "surname_th" => $rw["surname_th"],
            "signed_status" => 1,
        ]);
    }
    echo json_encode($data);
}
So I tried to follow this: echo json_encode() not working via ajax call.
It still doesn't show anything. Please tell me why?
Thank you.
You may try the following:
Always check the result of the sqlsrv_query() execution.
Always try to use parameterized statements. The sqlsrv_query() function does both statement preparation and statement execution, and can be used to execute parameterized queries.
Check the result of the json_encode() call.
Fix the typing errors (for example, "signed_status" => 1, with its trailing comma should be "signed_status" => 1).
Sample script, based on your code:
<?php
if (isset($_GET["scard"])) {
    $scard = $_GET["scard"];
    $data = array();
    $sql = "SELECT * FROM training_record WHERE cardref_no = ?";
    $params = array($scard);
    $q = sqlsrv_query($conn, $sql, $params);
    if ($q === false) {
        echo "Error (sqlsrv_query): ".print_r(sqlsrv_errors(), true);
        exit;
    }
    while ($rw = sqlsrv_fetch_array($q, SQLSRV_FETCH_ASSOC)) {
        $data[] = array(
            "employee_no" => $rw["employee_no"],
            "dept_id" => $rw["dept_id"],
            "name_th" => $rw["name_th"],
            "surname_th" => $rw["surname_th"],
            "signed_status" => 1
        );
    }
    $json = json_encode($data);
    if ($json === false) {
        echo json_last_error_msg();
        exit;
    }
    echo $json;
}
?>
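If the response still logs nothing, one small addition worth trying (assuming nothing has been echoed before it) is to declare the content type, so jQuery parses the response as JSON automatically instead of handing your success callback a plain string:

header('Content-Type: application/json');
echo $json;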
I'm trying to build a script where I need to read a txt file and run some processing on the lines of the file. For example, I need to check if an ID exists; if it does and the information has been updated, update the current table; if it does not, insert a new row into another temporary table to be checked manually later.
These files may contain more than 20-30 thousand lines.
When I just read the file and print some dummy content from the lines, it takes up to 40-50 ms. However, when I need to connect to the database to do all those verifications, the script stops before the end due to the timeout.
This is what I'm doing so far:
$handle = fopen($path, "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        $segment = explode('|', $buffer);
        if (strlen($segment[0]) > 6) {
            $param = [':code' => intval($segment[0])];
            $codeObj = Sql::exec("SELECT value FROM product WHERE code = :code", $param);
            if (!$codeObj) {
                $param = [
                    ':code' => $segment[0],
                    ':name' => $segment[1],
                    ':value' => $segment[2],
                ];
                Sql::exec("INSERT INTO product_tmp (code, name, value) VALUES (:code, :name, :value)", $param);
            } else {
                if ($codeObj->value !== $segment[2]) {
                    $param = [
                        ':code' => $segment[0],
                        ':value' => $segment[2],
                    ];
                    Sql::exec("UPDATE product SET value = :value WHERE code = :code", $param);
                }
            }
        }
    }
    fclose($handle);
}
And this is my Sql class, which connects with PDO and executes the query:
public static function exec($sql, $param = null) {
    try {
        $conn = new PDO('mysql:charset=utf8mb4;host=....'); // I've just deleted the information to connect to the database (password, user, etc.)
        $q = $conn->prepare($sql);
        if (isset($param)) {
            foreach ($param as $key => $value) {
                $$key = $value;
                $q->bindParam($key, $$key);
            }
        }
        $q->execute();
        $response = $q->fetchAll();
        if (count($response)) return $response;
        return false;
    } catch (PDOException $e) {
        return 'ERROR: ' . $e->getMessage();
    }
}
As you can see, each query I run through Sql::exec() opens a new connection. I don't know if this may be the cause of such a delay in the process, because when I don't run any SQL query, the script runs within milliseconds.
Or what other part of the code may be causing this problem?
First of all, make your function like this, to avoid multiple connects and also to get rid of useless code.
// Assumes the class declares a static property for the shared connection: private static $conn;
public static function getPDO() {
    if (!static::$conn) {
        static::$conn = new PDO('mysql:charset=utf8mb4;host=....');
    }
    return static::$conn;
}

public static function exec($sql, $param = null) {
    $q = static::getPDO()->prepare($sql);
    $q->execute($param);
    return $q;
}
Then create a unique index for the code field.
Then use a single INSERT ... ON DUPLICATE KEY UPDATE query instead of your three queries.
You may also want to wrap your inserts in a transaction; it may speed up the inserts up to 70 times.
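Put together, the processing loop could look like the sketch below. This assumes a unique index on product.code and the getPDO() helper above, and note that it upserts straight into product rather than routing new codes through product_tmp, per the single-query suggestion:

$pdo = Sql::getPDO();
// One prepared statement, reused for every line of the file.
$stmt = $pdo->prepare(
    "INSERT INTO product (code, name, value)
     VALUES (:code, :name, :value)
     ON DUPLICATE KEY UPDATE name = VALUES(name), value = VALUES(value)"
);

$handle = fopen($path, "r") or die("Couldn't get handle");
$pdo->beginTransaction(); // batch all the writes into a single transaction
while (($buffer = fgets($handle, 4096)) !== false) {
    $segment = explode('|', $buffer);
    if (strlen($segment[0]) > 6) {
        $stmt->execute([
            ':code'  => $segment[0],
            ':name'  => $segment[1],
            ':value' => rtrim($segment[2]), // strip the trailing newline left by fgets()
        ]);
    }
}
$pdo->commit();
fclose($handle);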
I usually work with web hosting companies, but I decided to start learning to work with servers to expand my knowledge.
I'd better give a real example to explain my question:
I have a web application that gathers data from a slow API that returns JSON data of products.
I have a function running every day at 1 AM that runs a lot of queries on "id"s in my database.
Crontab:
0 1 * * * cd /var/www/html/tools; php index.php aso Cli_kas kas_alert
So this creates a process for the app (please correct me here if I'm wrong), and each process creates threads. Just to be more accurate, they are multi-threads, since they do more than one thing: pulling data from the DB to get the right variables and stringing them into the API queries, getting the data from the API, organizing it, searching for the relevant data, and then inserting the new data into the database.
The main PHP functions:
// MAIN: Cron Job Function
public function kas_alert() {
    // 0. Deletes all the saved data from the `data` table 1 month+ ago.
    // $this->kas_model->clean_old_rows();

    // 1. Get 'prod' table
    $data['table'] = $this->kas_model->prod_table();

    // 2. Go through each row -
    foreach ($data['table'] as $row) {
        // 2.2. Gets all vars from the first query.
        $last_row_query = $this->kas_model->get_last_row_of_tag($row->tag_id);
        $last_row = $last_row_query[0];
        $l_aaa_id = $last_row->prod_aaa_id;
        $l_and_id = $last_row->prod_bbb_id;
        $l_r_aaa = $last_row->dat_data1_aaa;
        $l_r_and = $last_row->dat_data1_bbb;
        $l_t_aaa = $last_row->dat_data2_aaa;
        $l_t_and = $last_row->dat_data2_bbb;
        $tagword = $last_row->tag_word;
        $tag_id = $last_row->tag_id;
        $country = $last_row->kay_country;
        $email = $last_row->u_email;
        $prod_name = $last_row->prod_name;
        // For the weekly report:
        $prod_id = $last_row->prod_id;
        $today = date('Y-m-d');

        // 2.3. Run the tagword query again for today on each one of the tags and insert to DB.
        if (($l_aaa_id != 0) || (!empty($l_aaa_id))) {
            $aaa_data_today = $this->get_data1_aaa_by_id_and_kw($l_aaa_id, $tagword, $country);
        } else {
            $aaa_data_today['data1'] = 0;
            $aaa_data_today['data2'] = 0;
            $aaa_data_today['data3'] = 0;
        }
        if (($l_and_id != 0) || (!empty($l_and_id))) {
            $bbb_data_today = $this->get_data1_bbb_by_id_and_kw($l_and_id, $tagword, $country);
        } else {
            $bbb_data_today['data1'] = 0;
            $bbb_data_today['data2'] = 0;
            $bbb_data_today['data3'] = 0;
        }

        // 2.4. Insert the new variables to the "data" table.
        if ($this->kas_model->insert_new_tag_to_db($tag_id, $aaa_data_today['data1'], $bbb_data_today['data1'], $aaa_data_today['data2'], $bbb_data_today['data2'], $aaa_data_today['data3'], $bbb_data_today['data3'])) {
        }

        // Kas Alert outputs ($send is echoed in its original function)
        echo "<h1>prod Name: $prod_id</h1>";
        echo "<h2>tag id: $tag_id</h2>";
        var_dump($aaa_data_today);
        echo "aaa old: ";
        echo $l_r_aaa;
        echo "<br> aaa new: ";
        echo $aaa_data_today['data1'];
        var_dump($bbb_data_today);
        echo "<br> bbb old: ";
        echo $l_r_and;
        echo "<br> bbb new: ";
        echo $bbb_data_today['data1'];

        // 2.5. Check if there is a need to send something
        $send = $this->check_if_send($l_aaa_id, $l_and_id, $l_r_aaa, $aaa_data_today['data1'], $l_r_and, $bbb_data_today['data1']);

        // 2.6. If there is a trigger, send the email!
        if ($send) {
            $this->send_mail($l_aaa_id, $l_and_id, $aaa_data_today['data1'], $bbb_data_today['data1'], $l_r_aaa, $l_r_and, $tagword, $email, $prod_name);
        }
    }
}
For #Raptor, this is the function that gets the API data:
// aaa tag Query
// Gets aaa prod data by ID.
public function get_data_aaa_by_id_and_tg($id, $tag, $query_country) {
    $tag_for_url = rawurlencode($tag);
    $found = FALSE;
    $i = 0;
    $data = array();
    // Create a stream context for JSON; that's how the code knows what to expect to get.
    $context_opts = array(
        'http' => array(
            'method' => "GET",
            'header' => "Accepts: application/json\r\n"
        ));
    $context = stream_context_create($context_opts);
    while ($found == FALSE) {
        // aaa Query
        $json_query_aaa = "https://api.example.com:443/aaa/ajax/research_tag?app_id=$id&term=$tag_for_url&page_index=$i&country=$query_country&auth_token=666";
        // Get the JSON
        $json_query_aaa = file_get_contents($json_query_aaa, false, $context);
        // Turn the JSON into a PHP array
        $json_query_aaa = json_decode($json_query_aaa, true);
        // Get data2
        $data2 = $json_query_aaa['tag']['data2'];
        if (is_null($data2)) { $data2 = 0; }
        // Get data3
        $data3 = $json_query_aaa['tag']['phone_prod']['data3'];
        if (is_null($data3)) { $data3 = 0; }
        // Finally, the main prod array.
        $json_query_aaa = $json_query_aaa['tag']['phone_prod']['app_list'];
        if (count($json_query_aaa) > 2) {
            for ($j = 0; $j < count($json_query_aaa); $j++) {
                if ($json_query_aaa[$j]['id'] == $id) {
                    $found = TRUE;
                    $data = $json_query_aaa[$j]['data'] + 1;
                    break;
                }
                if ($found == TRUE) {
                    break;
                }
            }
            $i++;
        } else {
            $data = 0;
            break;
        }
    }
    $data['data1'] = $data;
    $data['data2'] = $data2;
    $data['data3'] = $data3;
    return $data;
}
All threads are stacked one after another, and only when one thread is done can the next thread proceed, etc.
And in a technical view of this, all threads wait in RAM until the one before them is done working "inside" the CPU. (Correct me if I'm wrong again :])
This doesn't even "tickle" the server's RAM or CPU when I look at it in the process manager (I use "htop"). RAM is at 400M/4.25G and CPU at only 0.7%-1.3%.
This makes me feel this isn't the best I can get from my current server, and that I'm getting slow results from my web app.
How do I get things done in a way that all threads work in parallel, but not to the point that my app crashes due to a lack of CPU or RAM?
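The low utilization figures above (CPU at roughly 1%) suggest the job is I/O-bound, waiting on the slow API rather than on the CPU, so one common approach is to run several worker processes in parallel, each handling a slice of the rows. A rough sketch of the idea with hypothetical names: a kas_alert($worker, $workers) variant launched several times from cron, where each instance only processes rows whose tag_id falls into its share (if this is CodeIgniter, the extra CLI segments are passed to the method as arguments):

0 1 * * * cd /var/www/html/tools; php index.php aso Cli_kas kas_alert 0 4
0 1 * * * cd /var/www/html/tools; php index.php aso Cli_kas kas_alert 1 4
0 1 * * * cd /var/www/html/tools; php index.php aso Cli_kas kas_alert 2 4
0 1 * * * cd /var/www/html/tools; php index.php aso Cli_kas kas_alert 3 4

// Hypothetical variant: 4 workers, each owning a quarter of the rows.
public function kas_alert($worker = 0, $workers = 1) {
    $data['table'] = $this->kas_model->prod_table();
    foreach ($data['table'] as $row) {
        if ($row->tag_id % $workers != $worker) {
            continue; // another worker owns this row
        }
        // ... same per-row logic as in kas_alert() above ...
    }
}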
I'm struggling with implementing PHP and MySQL transactions. The script receives an SQL statement along with some bind parameters through a blocking Redis queue. Everything is passed to a function do_transaction, which keeps track of the number of statements received.
I've debugged the PDO statement (after it has been processed) with PdoDebugger, and the output is correct:
UPDATE bla SET processed = 1, severity_ou1 = 'low',
severity_ou2 = 'low', severity_ou3 = 'low', severity_ou4 = 'low',
severity_ou5 = 'low', saved = '1', hname = '1', sname = '1', if = '1',
v = '1', translated = 'blablabla.', filtered = 1, repeated = '1',
excessed = '1', eventfilterid = '212', building = '1', floor = '1'
WHERE id = '121614624'
global $batchcount;
$batchcount = 1;

while (true) {
    $redis = new Redis();
    $redis->connect('xxx', xxx);
    $sqlbatch = $redis->blpop('xxx:xxx:sqlfiltermatch', 0);
    // blpop returns array: 0 has key, 1 has data.
    if (is_array($sqlbatch)) {
        if (isJson($sqlbatch[1])) {
            $batchstatements = array();
            $batchstatements[] = json_decode($sqlbatch[1], true);
            // Get statement and bindparams.
            $sqlstatement = $batchstatements[0]['statement'];
            $bindparams = $batchstatements[0]['bindparams'];
            // Replace empty bindparams.
            foreach ($bindparams as $column => $value) {
                if (is_null($value)) { $bindparams[$column] = '1'; }
                if (empty($value)) { $bindparams[$column] = '1'; }
            }
        }
        $batchcount++;
        do_transaction($sqlstatement, $bindparams, $batchcount);
    }
}

function do_transaction($sqlstatement, $bindparams) {
    global $batchcount;
    if ($batchcount >= 4) {
        try {
            // Setup DB
            $db = new PDO('mysql:host=xxx;dbname=xxx;charset=utf8', 'xxx', 'xxx', array(PDO::ATTR_PERSISTENT => true, PDO::ATTR_AUTOCOMMIT => FALSE, PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING));
            echo $db->getAttribute(PDO::ATTR_AUTOCOMMIT)."\n\n";
            $db->beginTransaction();
            $stmt = $db->prepare($sqlstatement);
            // Setup bindparams.
            foreach ($bindparams as $column => $value) {
                $stmt->bindParam(":$column", $value);
            }
            $stmt->execute() or die(print_r($stmt->errorInfo(), true));
            echo PdoDebugger::show($sqlstatement, $bindparams)."\n";
            $db->commit();
        } catch (PDOException $e) {
            //$db->rollback();
            print_r("ERROR"); exit;
        }
        $batchcount = 0;
    }
    $batchcount++;
}
I've made sure that AUTOCOMMIT = FALSE. Where in "do_transaction" does it go wrong?
There is no point in using transactions this way.
So, just leave them alone.
function do_query($db, $sqlstatement, $bindparams) {
    $stmt = $db->prepare($sqlstatement);
    $stmt->execute($bindparams);
    return $stmt;
}
is all the code you actually need.
Use it this way:
$db = new PDO('mysql:host=xxx;dbname=xxx;charset=utf8', 'xxx', 'xxx',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);
while (true) {
    // whatever redis code goes here
    do_query($db, $sqlstatement, $bindparams);
}
Nota bene: if you want to make your inserts faster, you should ask a question titled "How to make inserts faster", not "My transactions do not work".
But your idea of batching transactions is wrong too.
A single transaction (in terms of the business logic) has to be written to the database as soon as possible, without interfering with other transactions. That means you should never couple different business-logic transactions within a single database transaction, because an error in one business-logic transaction will ruin the whole batch. So, just write them separately.
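For completeness, here is a minimal sketch of what a single business-logic transaction looks like when you genuinely need one: several statements that must succeed or fail together (the table and column names are hypothetical):

try {
    $db->beginTransaction();
    $db->prepare("UPDATE accounts SET balance = balance - ? WHERE id = ?")
       ->execute([$amount, $from]);
    $db->prepare("UPDATE accounts SET balance = balance + ? WHERE id = ?")
       ->execute([$amount, $to]);
    $db->commit();
} catch (PDOException $e) {
    $db->rollBack(); // undo both statements, never just one
    throw $e;
}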
I'm a bit new to PHP, so this has got me a bit confused. I assume it's not an error within the code but rather in the code's logic.
There are two template files in use here:
Search - which displays the search page.
search_item - which provides the formatting for the returned results of the search.
I have dumped the execution of the query and it returns true, so this leads me to believe the query is successful and has received data from the form.
Below is the function for the Search.php page (displayed in the URL as index.php?action=search from the controller).
public function handleAction() {
    global $user, $config;
    $database = Database::getDatabase();
    $driver = $database->getDriver();
    $search = $_POST['keywords'];
    $stmt = $driver->prepare('SELECT * FROM clansoc_clans WHERE clan_name LIKE :keywords');
    $stmt->bindValue(':keywords', '%' . $search . '%');
    $stmt->execute();
    $results = array();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $results[] = array(
            'id' => $row['id'],
            'clan_name' => htmlspecialchars($row['clan_name'], ENT_QUOTES),
            'short_desc' => htmlspecialchars($row['clan_short_desc'], ENT_QUOTES),
            'clan_avatar' => $row['clan_avatar']
        );
    }
    $results_list_registry = new ViewRegistry();
    $results_list = new View('Results List', $results_list_registry, false);
    $results_list_contents = "";
    //$i = $offset;
    foreach ($results as $result) {
        $results_list_result_registry = new ViewRegistry();
        $this->view->getViewRegistry()->setVariable('id', $result['id']);
        $this->view->getViewRegistry()->setVariable('name', $result['clan_name']);
        $this->view->getViewRegistry()->setVariable('desc', $result['short_desc']);
        $this->view->getViewRegistry()->setVariable('avatar', $result['clan_avatar']);
        $results_list_result = new View('Results List Result', $results_list_result_registry);
        $results_list_result->setView('search_item');
        $results_list_contents .= $results_list_result->export(false);
    }
    $results_list->setView($results_list_contents);
    $this->view->getViewRegistry()->setVariable('results', $results_list);
}
My question in simpler terms:
Why is the script not displaying the results of the query?
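One thing that stands out in the loop above: a fresh $results_list_result_registry is created for each result, but the variables are then set on $this->view->getViewRegistry() instead, so the registry actually handed to the search_item view never receives them. A minimal sketch of the likely fix (assuming setVariable() behaves the same on any ViewRegistry instance):

foreach ($results as $result) {
    $results_list_result_registry = new ViewRegistry();
    // Set the variables on the registry the search_item view will actually read.
    $results_list_result_registry->setVariable('id', $result['id']);
    $results_list_result_registry->setVariable('name', $result['clan_name']);
    $results_list_result_registry->setVariable('desc', $result['short_desc']);
    $results_list_result_registry->setVariable('avatar', $result['clan_avatar']);
    $results_list_result = new View('Results List Result', $results_list_result_registry);
    $results_list_result->setView('search_item');
    $results_list_contents .= $results_list_result->export(false);
}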