Can you sum or add in a variable while inserting data? - php

I would like to ask if it is possible to add to a variable; I've managed to echo it and it worked.
But is it possible in an INSERT statement?
Here is my code!
<?php
include('admin/db/database_configuration.php');

$sql = "SELECT * FROM tblalbums";
$result = $conn->query($sql);

if ($result->num_rows > 0) {
    // output data of each row
    $row = $result->fetch_assoc();
    $wow = $row['album_id'];
    echo ($row['album_id'] + 1);
} else {
    echo "0 results";
}
$conn->close();
?>
<?php
include('admin/db/database_configuration.php');

if (isset($_FILES["userfile"]) && !empty($_FILES["userfile"])) {
    $image = $_FILES['userfile']['tmp_name'];
    $imageSize = $_FILES['userfile']['size'];
    $imageType = $_FILES['userfile']['type'];
    // $job_desc = $_POST['desc'];
    $imageName = $_FILES['userfile']['name'];
    $len = count($image);
    $path = "jobs/";

    for ($i = 0; $i < $len; $i++) {
        if (isset($imageName[$i]) && $imageName[$i] !== NULL) {
            if (move_uploaded_file($image[$i], $path.$imageName[$i])) {
                // $result = $conn->query("INSERT INTO tbljoba ( job_img_size, job_img_type, job_desc, job_img) VALUES ('$imageSize[$i]', '$imageType[$i]', '$job_desc', '$imageName[$i]')");
                $sql = "INSERT INTO `tblphotos` ( imageName, imageSize, imageType, album_id) VALUES ('$imageName[$i]','$imageSize[$i]', '$imageType[$i]', '$wow');";
                // echo "<script>alert('Image upload successfully!');location.href='employer_contact_us.php';</script>";
                if ($conn->query($sql) === TRUE) {
                    echo "New records created successfully";
                } else {
                    echo "Error: " . $sql . "<br>" . $conn->error;
                }
            }
        }
    }
}
$conn->close();
?>
As you can see above, the variable $wow holds album_id from the database, and I've echoed the last number in the database + 1.
So I've got the INSERT statement, and you can see I've put the variable $wow there. How can I add or sum up + 1 when inserting the values?
Thank you!

Remove the single quotes ' ' around $wow in VALUES, then add +1.
Use $wow+1 without the single quotes so that it is not read as a string by the database.
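For example, a minimal sketch (reusing $wow, $conn and the file variables from the question; not the only way to do it):
// Sketch only: compute the value in PHP and insert it unquoted, so it stays numeric.
$nextAlbumId = (int)$wow + 1;

$sql = "INSERT INTO `tblphotos` (imageName, imageSize, imageType, album_id)
        VALUES ('$imageName[$i]', '$imageSize[$i]', '$imageType[$i]', $nextAlbumId)";

if ($conn->query($sql) === TRUE) {
    echo "New records created successfully";
} else {
    echo "Error: " . $sql . "<br>" . $conn->error;
}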

You can find and read information about AUTO_INCREMENT fields in MySQL.
Maybe this can help you.
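A sketch of that approach (assuming album_id is an integer primary key and you are allowed to alter the table): with AUTO_INCREMENT, MySQL assigns the next id itself and you simply leave the column out of the INSERT.
// Hypothetical one-time change: let MySQL number album_id automatically.
$conn->query("ALTER TABLE tblalbums MODIFY album_id INT NOT NULL AUTO_INCREMENT");

// Afterwards new rows get the next id without any +1 logic in PHP
// (album_name is an assumed column, for illustration only):
$conn->query("INSERT INTO tblalbums (album_name) VALUES ('My new album')");
echo $conn->insert_id;   // the id MySQL just assigned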

Related

how to generate a unique id if we send the data comma-separated

I'm trying to use uniqid() to generate a unique name for each file and send it to the SQL db, joined with commas like ('pic.jpg','pic1.jpg','pic2.jpg').
In place of pic I need a unique name every time a file is uploaded; I store all uploaded files in a folder and send the path of each image to the SQL database.
<?php
if (isset($_POST['btnSubmit'])) {
    $rep = $_FILES['files']['name'];
    for ($i = 0; $i < count($rep); $i++) {
        if ($_FILES["files"]["size"][$i] < 1000000) { // Check file size (allow 1MB)
            $nam = $_FILES["files"]["name"][$i];
            // $nm = $_FILES["files"]["name"];
            $album_cat = "";
            $l = 0;
            foreach ($rep as $album_cat1) {
                $album_cat .= $album_cat1 . ",";
                $l++;
            }
            $_POST['$album_cat'] = $album_cat;
            $_POST['$album_cat_count'] = $l;
            $temp = $_FILES["files"]["tmp_name"][$i];
            $name = pathinfo($nam);
            $profile = "group_images/" . uniqid() . '.' . $name['extension'];
            if (empty($temp)) {
                break;
            }
            if ($i == 0) {
                $err = "File uploaded successfully";
                $cls = "success";
            }
            $groupalbum = "UPDATE group_master SET group_photo='".$_POST['$album_cat']."' WHERE group_id='4'";
            //$groupalbum = "UPDATE group_master SET group_photo='$profile' WHERE group_id='4'";
            if ($conn->query($groupalbum) === TRUE) {
            } else
                echo "Error updating record: " . $conn->error;
            move_uploaded_file($temp, $profile);
        }
        else {
            $err = "File size is more than 1MB";
            $cls = "danger";
        }
    }
}
?>
When I need a uniqid for a set of elements, I usually call uniqid() only once and after that I use an index (just to optimize the speed of the script). So you can call uniqid() before the for declaration:
$rep=$_FILES['files']['name'];
$uniqid = uniqid();
for ($i = 0; $i < count($rep); $i++) {
and then use $i as a suffix for your files
$profile = "group_images/".$uniqid.'-'.$i.'.'.$name['extension'];
On the other hand, you are calling move_uploaded_file($temp, $profile); even when your SQL fails, because the else without braces only covers the echo:
if ($conn->query($groupalbum) === TRUE) {
} else
    echo "Error updating record: " . $conn->error;
move_uploaded_file($temp, $profile);
}
Are you sure that the logic is correct? Don't you need something like:
if ($conn->query($groupalbum) === TRUE) {
    if (move_uploaded_file($temp, $profile)) {
        // file has been uploaded successfully
    } else {
        // error in file upload process
    }
} else {
    echo "Error updating record: " . $conn->error;
}
And the final point: your code is open to SQL injection.
$groupalbum = "UPDATE group_master SET group_photo='".$_POST['$album_cat']."' WHERE group_id='4'";
You should not use $_POST values directly in SQL statements.
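A minimal sketch of that UPDATE as a mysqli prepared statement (table and column names from the question; the variable names here are only illustrative):
// Sketch only: bind the values instead of concatenating $_POST into the SQL string.
$group_photo = $album_cat;   // whatever value you were concatenating before
$group_id = 4;

$stmt = $conn->prepare("UPDATE group_master SET group_photo = ? WHERE group_id = ?");
$stmt->bind_param('si', $group_photo, $group_id);
if (!$stmt->execute()) {
    echo "Error updating record: " . $stmt->error;
}
$stmt->close();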

How to get the last inserted id in MySQL and use it to insert into another table?

I'm trying to get the last id, which I already inserted into the database through another PHP file, and in the current PHP file I can't get that last id to use it.
This is my code.
<?php
include "connection.php.php";

$sucess = 105;
$fail = 107;

$planImage = $_POST['projects_plan_images'];
$planFilename = $_POST['projectsPlanImagesNames'];

$last_id = mysqli_insert_id($con);

$planLength = count($planFilename);
for ($i = 0; $i < $planLength; $i++) {
    $imageNow = time();
    $new_imageFilename = $imageNow . "_" . $planFilename[$i];

    $sql6 = "INSERT INTO projects_plans_images(project_id, plan_image, plan_name) VALUES ('$last_id','$new_imageFilename','$new_imageFilename')";
    $result6 = mysqli_query($con, $sql6);

    $binary = base64_decode($planImage[$i]);
    $file = fopen('files/plans_images/' . $new_imageFilename, 'wb');
    fwrite($file, $binary);
    fclose($file);
}

$jsonResult = '{"state":';
if ($result6) {   // check the last INSERT ($result was never defined here)
    $jsonResult .= $sucess;
} else {
    $jsonResult .= $fail;
}
$jsonResult .= "}";
print_r($jsonResult);
?>
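As a side note (a sketch, not part of the original post): mysqli_insert_id($con) only returns a useful value on the same connection, immediately after an INSERT into a table with an AUTO_INCREMENT key; called at the top of a fresh request, as above, it returns 0. The table and column names below are assumed for illustration.
// Hypothetical sketch: run the parent INSERT and read its id on the same connection.
mysqli_query($con, "INSERT INTO projects (project_name) VALUES ('Example project')");
$last_id = mysqli_insert_id($con);   // id generated by the INSERT above

mysqli_query($con, "INSERT INTO projects_plans_images (project_id, plan_image, plan_name)
                    VALUES ('$last_id', 'plan.jpg', 'plan.jpg')");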

How to use a single foreach loop over multiple arrays to build a single SQL statement

I want to use a single foreach for two arrays.
Array 1: has the path of each image (the path is dynamic, as per the user's needs)
Array 2: has the value of the description field for each image (the description is dynamic, as per the user's needs)
I want to insert both arrays into the SQL table using only one SQL statement.
<?php $sql="INSERT INTO posts(title,description,category,createdBy,pictureURL,CreatedAt) VALUE ('$title',
'$description','$category','$creatdby','Admin/PostImages/$finalpath',now());";
$result = mysqli_query($conn,$sql);
if($result)
{
$upload_directory = 'PostImages/';
$field_values_array = $_REQUEST['desc'];
$x=0;
foreach($field_values_array as $value1){
foreach ( $_FILES['photo']['name'] AS $key => $value ){
//Move file to server directory
if(move_uploaded_file($_FILES["photo"]["tmp_name"][$x], $upload_directory . $_FILES["photo"]["name"][$x])){
$finalpath=$upload_directory . $_FILES["photo"]["name"][$x];
}
if (isset($_SESSION['p_id'])){
$p_id = $_SESSION["p_id"];
}
$sql1="INSERT INTO `postimages`(`p_id`,`description`, `img_path`) VALUES ('$p_id','$value1','$finalpath')";
$result1 = mysqli_query($conn,$sql1);
$x++;
}
}
header("Location: uploadpost_test.php");
}
else
{
echo "Error: " . $sql . "<br>" . $conn->error;
} ?>
I refactored and cleaned up your code. Try this.
if (isset($_SESSION['p_id'])) {
    $p_id = $_SESSION["p_id"];
}

$values = array();
foreach ($field_values_array as $x => $value1) {
    // the destination should use the original client file name, not tmp_name
    $finalpath = $upload_directory . $_FILES["photo"]["name"][$x];
    if (move_uploaded_file($_FILES["photo"]["tmp_name"][$x], $finalpath)) {
        $values[] = "('$p_id','$value1','$finalpath')";
    }
}

if (!empty($values)) {
    // one multi-row INSERT instead of one query per image
    $sql1 = "INSERT INTO `postimages`(`p_id`,`description`, `img_path`) VALUES " . implode(',', $values);
    $result1 = mysqli_query($conn, $sql1);
}

PHP Mysqli insert into database with large iterations

Below I have PHP code that loops through an array and, for each item, checks whether the value already exists in the database and, if not, creates it. The code itself works, but the loop can be insanely big, up to a couple of tens of thousands of iterations.
How can I optimize this code? What should I use and how? There should be a better way to insert this many rows without looping through each one individually.
foreach ($arr as $value) {
    $checkID = mysqli_query($cenn, "SELECT item_id from items WHERE item_id = '$value'");
    if (!$checkID) {
        die("Query '$checkID' failed to execute for some reason");
    } else {
        if (mysqli_num_rows($checkID) > 0) {
            $user = mysqli_fetch_array($checkID);
            echo "item_id " . $value . " exists already";
        } else {
            echo "item_id: '$value' doesn't exist<br>";
            $gw2Api = file_get_contents("https://api.guildwars2.com/v2/items/" . $value); //12452 30704
            $gw2Api_result = json_decode($gw2Api, true);
            /* Here would be some code to determine values that are being inserted */
            if (!array_key_exists("description", $gw2Api_result)) {
                $description = 'No description available...';
            } else {
                if ($gw2Api_result['description'] === '') {
                    $description = "No description available...";
                } else {
                    $description = $gw2Api_result['description'];
                }
            }
            $insertItem = "INSERT INTO items
                               (item_id, name, description,
                                AccountBindOnUse, AccountBound,
                                last_update
                               )
                           VALUES ('$value', '$gw2Api_result[name]', '$description',
                                   '$AccountBindOnUse', '$AccountBound', CURRENT_TIMESTAMP)";
            if ($cenn->query($insertItem) === true) {
                echo "New record '$value' created successfully";
            } else {
                echo "Error: " . $insertItem . "<br>" . $cenn->error;
            }
        }
    }
} // end foreach
The question: how to insert many values (new rows) into a MySQL database via mysqli as fast as possible.
Just use a bulk insert.
Collect all the rows for insertion and pass them in one query.
echo 'hi';
if (!empty($arr)) {
    echo 'ok';
    $values = "'" . implode("', '", $arr) . "'";
    $qExistingItemIds = mysqli_query($cenn, "SELECT item_id from items WHERE item_id IN($values)");
    $existingItemIds = [];
    while ($existingItemId = mysqli_fetch_array($qExistingItemIds)) {
        $existingItemIds[] = $existingItemId['item_id'];
    }
    $arr = array_diff($arr, $existingItemIds);

    $inserts = array();
    $i = 0;
    $ic = count($arr);
    foreach ($arr as $value) {
        $i++;
        echo "item_id: $value doesn't exist<br>";
        $gw2Api = file_get_contents("https://api.guildwars2.com/v2/items/" . $value); //12452 30704
        $gw2Api_result = json_decode($gw2Api, true);
        /* Here would be some code to determine values that are being inserted */
        if (!array_key_exists("description", $gw2Api_result)) {
            $description = 'No description available...';
        } else {
            if ($gw2Api_result['description'] === '') {
                $description = "No description available...";
            } else {
                $description = $gw2Api_result['description'];
            }
        }
        $inserts[] = "
            ('$value', '$gw2Api_result[name]', '$description', '$AccountBindOnUse', '$AccountBound', CURRENT_TIMESTAMP)
        ";
        if ($i == 50 OR $i == $ic) {
            // flush a batch of up to 50 rows as one multi-row INSERT
            $inserts = implode(",", $inserts);
            $insert = "
                INSERT INTO items
                    (item_id, name, description, AccountBindOnUse, AccountBound, last_update)
                VALUES
                    $inserts
            ";
            if ($cenn->query($insert) === true) {
                echo 'great';
                echo "New records created successfully";
            } else {
                echo "Error: " . $insert . "<br>" . $cenn->error;
            }
            $ic -= 50;
            $i = 0;
            $inserts = array();
        }
    }
}
so now we have only a handful of queries (one SELECT plus one INSERT per batch of 50) instead of thousands
details about bulk insert:
http://www.geeksengine.com/database/data-manipulation/bulk-insert.php
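For reference, a minimal sketch of the multi-row INSERT syntax the batches above produce (the table comes from the question; the item values here are purely illustrative):
// Illustrative only: one statement inserting three rows at once.
$cenn->query("
    INSERT INTO items (item_id, name, description, AccountBindOnUse, AccountBound, last_update)
    VALUES
        (100, 'Item A', 'No description available...', '0', '0', CURRENT_TIMESTAMP),
        (101, 'Item B', 'Some description',            '0', '1', CURRENT_TIMESTAMP),
        (102, 'Item C', 'Another description',         '1', '0', CURRENT_TIMESTAMP)
");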
If you use prepared statements you should reduce the round trips to the database server, and each query is compiled and optimised only once instead of compiling Number_of_inputs * 2 queries. This should reduce the workload.
I would be very interested to know by how much.
$sql = "SELECT item_id from items WHERE item_id = ?";
$db_select = $cenn->prepare($sql);
if ( ! $db_select ) {
echo $cenn->error;
exit;
}
$sql_insert = "INSERT INTO items
(item_id, name, description,
AccountBindOnUse, AccountBound, last_update)
VALUES (?, ?, ?, ?, ?, CURRENT_TIMESTAMP)";
$db_insert = $cenn->prepare($sql);
if ( ! $db_insert ) {
echo $cenn->error;
exit;
}
foreach($arr as $value){
$db_select->bind_param('i', $value);
$res = $db_select->execute()
if ( $res === FALSE ) {
echo $cenn->error;
exit;
}
if ($db_select->num_rows > 0) {
// dont bother fetching data we already know all we need to
$user = $db_select->free();
echo "item_id $value exists already";
} else {
echo "item_id: $value doesn't exist<br>";
$gw2Api = file_get_contents("https://api.guildwars2.com/v2/items/" . $value);
$gw2Api_result = json_decode($gw2Api,true);
if ( ! array_key_exists("description",$gw2Api_result)
|| $gw2Api_result['description'] === '') {
$description = 'No description available...';
} else{
$description = $gw2Api_result['description'];
}
$db_insert->bind_param('issss', $value, $gw2Api_result[name],
$description, $AccountBindOnUse,
$AccountBound)
if ($cenn->query($insertItem) === true) {
echo "New record $value' created successfully";
} else {
echo "Error: " . $sql_insert . "<br>" . $cenn->error;
}
}
} // end foreach

Build a batch query for MySQL insert every 1000 items

I need to perform a batch insert in MySQL/MariaDB, but since the data is dynamic I need to build the proper SQL query myself. In a few steps:
I should find whether the current row already exists in the table - this is the first SELECT inside the loop
Right now I have 1454 rows but will have to insert around 150k later, so a batch query is better than 150k single-row INSERTs in the loop
If the record already exists I should update it; if it doesn't, I should insert it. I don't care about the UPDATE yet, and the code you're seeing is only for the INSERT
So here is what I am doing:
// Get values from Csv file as an array of values
$data = convertCsvToArray($fileName);
echo "DEBUG count(data): ", count($data), "\n";

$i = 0;
$sqlInsert = "INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) ";

// Processing on each row of data
foreach ($data as $row) {
    $sql = "SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='{$row['Id']}'";
    echo "DEBUG: ", $sql, "\n";
    $rs = $conn->query($sql);
    if ($rs === false) {
        echo 'Wrong SQL: '.$sql.' Error: '.$conn->error, E_USER_ERROR;
    } else {
        $rows_returned = $rs->num_rows;
        $veeva_rep_id = "'".$conn->real_escape_string($row['Id'])."'";
        $first = "'".$conn->real_escape_string(ucfirst(strtolower($row['FirstName'])))."'";
        $last = "'".$conn->real_escape_string(ucfirst(strtolower($row['LastName'])))."'";
        $email = "'".$conn->real_escape_string($row['Email'])."'";
        $username = "'".$conn->real_escape_string($row['Username'])."'";
        $display_name = "'".$conn->real_escape_string(
            ucfirst(strtolower($row['FirstName'])).' '.ucfirst(strtolower($row['LastName']))
        )."'";

        // VALUES should be added only if row doesn't exist
        if ($rows_returned === 0) {
            // VALUES should be appended until they reach 1000
            while ($i % 1000 !== 0) {
                $sqlInsert .= "VALUES($veeva_rep_id,$first,$last,$email,$username,NOW(),NOW(),$display_name,'VEEVA','https://pdone.s3.amazonaws.com/avatar/default_avatar.png',NOW(),NOW())";
                ++$i;
            }
            // QUERY should be output to console to see if it's right or something is wrong
            echo "DEBUG: ", $sqlInsert, "\n";
            // QUERY should be executed if there are 1000 VALUES ready to add as a batch
            /*$rs = $conn->query($sqlInsert);
            if ($rs === false) {
                echo 'Wrong SQL: '.$sqlInsert.' Error: '.$conn->error, E_USER_ERROR;*/
        } else {
            // UPDATE
            echo "UPDATE";
        }
    }
}
But this line of code: echo "DEBUG: ", $sql, "\n"; is not outputting anything to the console. I must be doing something wrong but I can't find what. Can anyone help me build the proper batch query and execute it every 1000 appended values?
Proper output should be:
DEBUG count(data): 1454
DEBUG: SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='00580000008ReolAAC'
DEBUG: SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='005800000039SIWAA2'
....
DEBUG: INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) VALUES(...), VALUES(...), VALUES(...)
Obtained result:
DEBUG count(data): 1454
DEBUG: SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='00580000008RGg6AAG'
DEBUG: INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt)
DEBUG: SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='00580000008RQ4CAAW'
DEBUG: INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt)
.... // until reach 1454 results
The table is empty, so it should never go through the ELSE condition (the UPDATE one).
EDIT
With help from the answer this is how the code looks now:
$data = convertCsvToArray($fileName);
echo "DEBUG count(data): ", count($data), "\n";
$i = 1;
$sqlInsert = "INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) VALUES";

foreach ($data as $row) {
    $sql = "SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='{$row['Id']}'";
    $rs = $conn->query($sql);
    if ($rs === false) {
        echo 'Wrong SQL: '.$sql.' Error: '.$conn->error, E_USER_ERROR;
    } else {
        $rows_returned = $rs->num_rows;
        $veeva_rep_id = "'".$conn->real_escape_string($row['Id'])."'";
        $first = "'".$conn->real_escape_string(ucfirst(strtolower($row['FirstName'])))."'";
        $last = "'".$conn->real_escape_string(ucfirst(strtolower($row['LastName'])))."'";
        $email = "'".$conn->real_escape_string($row['Email'])."'";
        $username = "'".$conn->real_escape_string($row['Username'])."'";
        $display_name = "'".$conn->real_escape_string(
            ucfirst(strtolower($row['FirstName'])).' '.ucfirst(strtolower($row['LastName']))
        )."'";

        if ($rows_returned === 0) {
            if ($i % 1000 === 0) {
                file_put_contents("output.log", $sqlInsert."\n", FILE_APPEND);
                $sqlInsert = "INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) VALUES";
            } else {
                $sqlInsert .= "($veeva_rep_id,$first,$last,$email,$username,NOW(),NOW(),$display_name,'VEEVA','https://pdone.s3.amazonaws.com/avatar/default_avatar.png',NOW(),NOW()), ";
            }
            $i++;
        } else {
            echo "UPDATE";
        }
    }
}
But it is still buggy because:
I get a first, empty INSERT query: INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) VALUES
I get a second INSERT query with 1000 VALUES appended, but what happened to the rest? The remaining 454?
Can anyone give me another tip? Help?
Since it looks like you are trying to load data from a CSV file, you might want to consider using the LOAD DATA INFILE functionality, which is designed specifically for this purpose.
Here is a link to the documentation: https://dev.mysql.com/doc/refman/5.6/en/load-data.html
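For example, a rough sketch (assuming the CSV columns line up with the listed table columns, that the server allows local infile, and that the path is adjusted to your setup):
// Sketch only: let MySQL bulk-load the CSV directly instead of building INSERTs in PHP.
$conn->query("
    LOAD DATA LOCAL INFILE '/path/to/reps.csv'
    INTO TABLE reps
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (veeva_rep_id, first, last, email, username, display_name)
");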
Consider using INSERT IGNORE INTO table to handle records that already exist: How to 'insert if not exists' in MySQL?
If you haven't already done so, make veeva_rep_id a PRIMARY key so the INSERT IGNORE will work.
Also check out PDO for transactions, prepared statements and dynamically generated queries:
PDO Prepared Inserts multiple rows in single query
<?php
$sql = 'INSERT IGNORE INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) VALUES ';
$insertQuery = array();
$insertData = array();
/*
assuming the array from the csv is like this
$data = array(
    0 => array('name' => 'Robert', 'value' => 'some value'),
    1 => array('name' => 'Louise', 'value' => 'another value')
);
*/
$n = 0;   // counter so each row gets its own named placeholders
foreach ($data as $row) {
    $insertQuery[] = '(:veeva_rep_id' . $n . ', :first' . $n . ', :last' . $n . ', :email' . $n . ', :username' . $n . ', :lastLoginAt' . $n . ', :lastSyncAt' . $n . ', :display_name' . $n . ', :rep_type' . $n . ', :avatar_url' . $n . ', :createdAt' . $n . ', :updatedAt' . $n . ')';
    $insertData['veeva_rep_id' . $n] = $row['name'];
    $insertData['first' . $n] = $row['value'];
    $insertData['last' . $n] = $row['name'];
    $insertData['email' . $n] = $row['value'];
    $insertData['username' . $n] = $row['name'];
    $insertData['lastLoginAt' . $n] = $row['value'];
    $insertData['lastSyncAt' . $n] = $row['value'];
    $insertData['display_name' . $n] = $row['name'];
    $insertData['rep_type' . $n] = $row['value'];
    $insertData['avatar_url' . $n] = $row['value'];
    $insertData['createdAt' . $n] = $row['name'];
    $insertData['updatedAt' . $n] = $row['value'];
    $n++;
}

$db->beginTransaction();
if (!empty($insertQuery) and count($insertQuery) > 1000) {
    $sql .= implode(', ', $insertQuery);
    $stmt = $db->prepare($sql);
    $stmt->execute($insertData);
}
$db->commit();

print $sql . PHP_EOL;
let me know if it helps.
You should have something like:
// Try fetching data from table 1
// If there is no record available, then fetch some data from table 2
// and insert that data inito table 1
You just wrote
$sql = "INSERT INTO reps(veeva_rep_id,first,last,email,username,lastLoginAt,lastSyncAt,display_name,rep_type,avatar_url,createdAt,updatedAt) ";
// Processing on each row of data
foreach ($data as $row) {
But an INSERT does not select any data, and second... you didn't run a SELECT, so where does $data come from?
Update: use if ($i % 1000 === 0) { instead of while ($i % 1000 !== 0) {
$i = 0;
$sqlInsert = "INSERT INTO reps(veeva_rep_id,first,last,email,...) VALUES ";
// Processing on each row of data
foreach ($data as $row) {
    $sql = "SELECT id,lastSyncAt FROM reps WHERE veeva_rep_id='{$row['Id']}'";
    echo "DEBUG: ", $sql, "\n";
    $rs = $conn->query($sql);
    if ($rs === false) {
        echo 'Wrong SQL: '.$sql.' Error: '.$conn->error, E_USER_ERROR;
    } else {
        $veeva_rep_id = ...;
        $first = ...;
        $last = ...;
        $email = ...;
        // ...
        // VALUES should be added only if the row doesn't exist
        if ($rs->num_rows == 0) {
            // Append this row's values first, then flush every 1000 rows
            $sqlInsert .= "($veeva_rep_id,$first,$last,$email,...), ";
            $i++;
            if ($i % 1000 === 0) {
                echo "DEBUG: ", $sqlInsert, "\n";
                // execSql($sqlInsert);
                $sqlInsert = "INSERT INTO reps(veeva_rep_id,first,last,email,...) VALUES "; // reset
            }
        } else {
            echo "UPDATE";
        }
    }
}
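One detail this sketch leaves open (and the reason the remaining 454 rows went missing in the edit above): after the loop there is usually a partial batch still sitting in $sqlInsert that has to be executed too. A hedged sketch of that final flush, reusing $i, $sqlInsert and $conn from above:
// Sketch only: flush whatever is left after the loop (the final partial batch).
if ($i % 1000 !== 0) {
    $sqlInsert = rtrim($sqlInsert, ', ');   // drop the trailing ", " left by the last append
    echo "DEBUG: ", $sqlInsert, "\n";
    if ($conn->query($sqlInsert) === false) {
        echo 'Wrong SQL: ' . $sqlInsert . ' Error: ' . $conn->error, E_USER_ERROR;
    }
}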
