I've got the following problem: when I try to increment one column of my MySQL table, the value gets out of sync under high traffic loads.
The task is split into two parts: incrementing the column and adding a new row
to another table. So if I add 100 rows, the counter should stay at 100.
Under high load, e.g. 100 requests/s, the counter runs into problems: I get
80 as the counter value, but 120 rows have been added.
//This is the current increment routine, built with a Yii ActiveRecord class
$id = 123;
$dataRow = ActiveRecordModel::model()->findByPk($id); // read the current value
$dataRow->counter += 1;                               // increment it in PHP
$dataRow->save();                                     // write it back
//Add the row
$row = new ActiveRecordModelRow();
$row->operatingSystem = 1;
$row->save();
I think the problem is that some requests are handled faster than others and maybe overwrite each other's values. I hope someone can point me in the right direction or has a suggestion for how to solve this problem.
Best,
Nils
Can't you use an auto-increment field in MySQL, so that the database handles the increment rather than the PHP? It wouldn't matter how long the processing took, as the integer would be set when the record was inserted.
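If the counter column has to stay (for example as a denormalized count), a common fix is to let MySQL do the arithmetic in a single atomic UPDATE instead of the read-modify-write above. A sketch using Yii 1.x's updateCounters(), which issues exactly such a statement (model and column names taken from the question):
$id = 123;
// MySQL evaluates counter = counter + 1 itself, so two concurrent
// requests can no longer read the same old value and overwrite each other.
ActiveRecordModel::model()->updateCounters(
    array('counter' => 1), // column => amount to add
    'id = :id',
    array(':id' => $id)
);
// Equivalent raw SQL: UPDATE some_table SET counter = counter + 1 WHERE id = 123;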
I am creating an application that inserts (or updates) values in MySQL daily. A simplified recordset with headers is:
ItemName,ItemNumber,ItemQty,Date
test1,1,5,2016/01/01
test1,1,3,2016/01/02
test2,2,7,2016/01/01
test2,2,5,2016/01/02
A simple INSERT statement for the above recordset, with 16 columns and 216,000 records, takes about 4 minutes (PHP/MySQL); this covers a week of values. Of course, if I import the same recordset again, I get duplicates. I am trying to find a way to effectively disallow duplicate entries.
The aim is: in the scenario where I import, every day, a recordset that has dates for the current week, I end up with only the new dates being added.
The only thing that might change between consecutive imports is the ItemQty.
In PHP I wrote logic that queries the DB for ItemName, ItemNumber and Date with the values I am about to insert. If the SELECT statement returns a result, I skip the row; if it doesn't, I proceed with inserting a new row.
The problem is that with this logic added it no longer takes 4 minutes, but a couple of hours. (It works, though.)
Any ideas?
I was thinking of inserting something like a checksum column, for example md5(ItemName, ItemNumber, ItemQty, Date), and then checking that checksum rather than the SELECT * FROM $table WHERE ItemName = value AND ItemNumber = value AND ItemQty = value AND Date = value that I currently have.
My problem is that the records I insert have nothing inherently unique. Uniqueness only comes from a group of fields, compared against the dataset to be imported. If I manage to get uniqueness somehow, I'll also solve my other problem, which is deleting or updating a row when the ItemQty changes.
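For what it's worth, the checksum idea from the question can be pushed into MySQL itself. A sketch, assuming MySQL 5.7+ generated columns and a made-up table name items; ItemQty is deliberately left out of the hash, since it is the one field that may change between imports:
ALTER TABLE items
  ADD COLUMN row_hash CHAR(32)
    AS (MD5(CONCAT_WS('|', ItemName, ItemNumber, `Date`))) STORED,
  ADD UNIQUE KEY uq_row_hash (row_hash);
-- A re-imported row with the same ItemName/ItemNumber/Date now fails with
-- a duplicate-key error instead of needing a SELECT round trip first.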
What you are looking for is a unique constraint. You can add all the relevant columns to the constraint, and if an inserted row matches an existing one on all of those columns, the insert will not proceed.
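A sketch of such a constraint for the recordset above (the table name items is made up, since the question never names it):
ALTER TABLE items
  ADD UNIQUE KEY uq_item (ItemName, ItemNumber, `Date`);
-- With the key in place, duplicate rows can simply be skipped in bulk:
INSERT IGNORE INTO items (ItemName, ItemNumber, ItemQty, `Date`)
VALUES ('test1', 1, 5, '2016-01-01');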
A few options:
1) In PHP, iterate over the records, mapping the duplicates and keeping only the newest:
$itemsArray = []; // The array where you have stored your data
$uniqueItems = [];
foreach($itemsArray as $item)
{
if(isset($uniqueItems[$item['ItemName']]))
{
$oldRecord = $uniqueItems[$item['ItemName']];
$newTimeStamp = strtotime($item['Date']); // might not work with your date format
$currentTimeStamp = strtotime($oldRecord['Date']);
if($newTimeStamp > $currentTimeStamp)
{
$uniqueItems[$item['ItemName']] = $item;
}
}
else
{
$uniqueItems[$item['ItemName']] = $item;
}
}
// $uniqueItems now holds only one record per ItemName (the newest one)
2) Sort the data in PHP by date in ascending order (before inserting into the database). Then use ON DUPLICATE KEY UPDATE in your INSERT statement. This makes MySQL update the records that hit a duplicate key. Since the older records are inserted first and the latest records last, the latest data overwrites the old, as sketched below.
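A sketch, again assuming a unique key over (ItemName, ItemNumber, Date) and the hypothetical table name items:
INSERT INTO items (ItemName, ItemNumber, ItemQty, `Date`)
VALUES ('test1', 1, 3, '2016-01-02')
ON DUPLICATE KEY UPDATE ItemQty = VALUES(ItemQty);
-- New (ItemName, ItemNumber, Date) combinations are inserted;
-- existing ones only get their ItemQty refreshed.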
I am creating a job number system that a few users will be using at the same time. I create a job number on the PHP page, save it to the job sheet, and use it to link other tables to the job.
I take the job number from a table called numbers, which should then be incremented by 1 each time a job is submitted, ready for the next job.
But the numbers are not working correctly.
As an example I get 1, 2, 3, 4, 8, then 43, 44, 45, then 105.
I can't see why they would jump so much.
$job_number_query = "SELECT * FROM numbers";
$job_result = $mysqli->query($job_number_query);
$job_num = mysqli_fetch_assoc($job_result);
$increment_job_number = $job_num['job_number'];
$update_job_number_query = "UPDATE numbers SET job_number = $increment_job_number + 1";
$mysqli->query($update_job_number_query);
//echo ($customer_id);
Then I simply insert $increment_job_number into the jobsheet table.
I am using INT for the job_number field in the numbers table.
I can't think of a way to test the numbers. I guess one way is to look through the jobsheets and add the next number there, but because more than one user might have a job that hasn't been submitted yet, this could also cause problems.
Just increase the value without the first SELECT:
UPDATE numbers SET job_number = job_number +1
You have no where clause on your update query, so you're incrementing the job_number field in ALL records in the table.
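And if each request also needs to read back the number it just reserved, without a race between the UPDATE and a following SELECT, MySQL's LAST_INSERT_ID(expr) pattern is a common way to do it (a sketch, assuming the single-row numbers table from the question):
UPDATE numbers SET job_number = LAST_INSERT_ID(job_number + 1);
SELECT LAST_INSERT_ID();
-- LAST_INSERT_ID(expr) is stored per connection, so every request reads
-- back exactly the job number its own UPDATE produced.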
In the end it was me that was the technical failure. I had the incrementing number on the create page, but unfortunately I had also put it on the edit pages, so every time I edited a page I added 1 to the number field in the numbers table.
I have a database named creative_db; the table name is store_weblinks. Inside this table there are several columns that hold the entire site's weblinks. My focus is on this column: weblinks_status.
weblinks_status contains two values: waiting and live.
So here's what I intend to do: update any 10 waiting rows to live status.
I think I need a loop of some kind that keeps a count of how many successful edits have taken place. Once that hits 10, it stops processing.
So, it's like this: check whether the current item's weblinks_status is waiting. If it is, change it to live and increment the loop counter; otherwise proceed to the next waiting item.
Need your help!
Try this statement:
UPDATE store_weblinks
SET weblinks_status="live"
WHERE weblinks_status="waiting"
LIMIT 10;
This should update any 10 entries from waiting to live.
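If you still want the "loop counter" from the question, i.e. how many rows were actually switched, mysqli reports that after the UPDATE. A minimal sketch, assuming an open $mysqli connection:
$sql = 'UPDATE store_weblinks
        SET weblinks_status = "live"
        WHERE weblinks_status = "waiting"
        LIMIT 10';
$mysqli->query($sql);
// Number of rows that actually changed (0 to 10); no PHP loop required.
echo $mysqli->affected_rows;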
I'm trying to retrieve data from SQL table A, modify some columns, then insert the modified columns into SQL table B.
However, my issue is that when I use:
$customer = new Customer;
$fakecustomer = new Fakecustomer;
$fake_customer_name_records = $fakecustomer->get()->toArray();
$fake_customer_name_records_arry = []; // collect the processed rows here
foreach ($fake_customer_name_records as $record) {
    //process columns for each record
    $fake_customer_name_records_arry[] = array(
        'last_name'  => $last_name,
        'first_name' => $first_name,
        'home_phone' => $phonenumber,
    );
}
$customer->insert($fake_customer_name_records_arry);
It can only insert around 1,000 records. Is there a way in Laravel to process about 60,000 records?
Thanks
I would suggest using the "chunk" option here and processing the records in chunks. It's the more natural way, in my opinion. Here's what the docs say:
Chunking Results
If you need to process a lot (thousands) of Eloquent records, using
the chunk command will allow you to do so without eating all of your RAM:
User::chunk(200, function($users)
{
foreach ($users as $user)
{
//
}
});
The first argument passed to the method is the number of records you
wish to receive per "chunk". The Closure passed as the second argument
will be called for each chunk that is pulled from the database.
Link to read more: click
Use an extra variable and add 1 on every iteration; when it reaches 1,000 (or less), execute the insert and reset the counter.
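A sketch of that idea using array_chunk(), assuming $fake_customer_name_records_arry has been built as in the question:
// Send the rows in batches of 1,000 so no single INSERT grows too large.
foreach (array_chunk($fake_customer_name_records_arry, 1000) as $batch) {
    $customer->insert($batch);
}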
Have you tried disabling the query log with DB::disableQueryLog()? I had the same problem and this pretty much solved it.
Also, when working with migrations or some process that's going to take a lot of time, try to create a command instead of trying to do it from a controller.
I'm using PHP and phpMyAdmin to create a small profile site.
I'm giving members an ID number, based on the biggest number currently in the database, plus 1.
I did 25 tests before I got the PHP script where I wanted it to be.
I then deleted those 25 entries using phpMyAdmin.
But now, when my PHP code does this:
function getLatestID() {
    $query = "SELECT MAX(member_id) FROM members";
    $result = @mysql_query($query) or showError("unable to query database for user information");
    if (!($record = mysql_fetch_array($result))) return null;
    return $record[0];
}
I get the wrong number.
Test scenario: the database table holds 3 entries, with IDs 1, 2 and 3.
I start a debugging session and put a breakpoint on the return $record[0].
I check its contents, and instead of 3, which is the biggest number, it's 28.
As in 25 + 3 = 28, counting the 25 entries that I already deleted...
Does anybody know what's causing this and how I can fix it?
It's probably because you have auto_increment set and the query is returning the highest id. When you deleted the other records, you probably didn't reset the auto increment count.
If you're using auto_increment in MySQL then deleting records won't decrease the next value.
You can empty a table with TRUNCATE TABLE mytable - this will reset the value.
You can also change the value that auto-increment thinks is the next value to allocate:
ALTER TABLE members AUTO_INCREMENT = 3;
Note that if you put in a value that is less than the current max value in the auto-increment column, it'll change the value to that MAX+1. To see what the current next value is set to, do this:
SHOW CREATE TABLE members;
At the end of the table definition, it'll show "AUTO_INCREMENT = 26" or whatever its current value is.