Here is the problem:
I have a PHP file that is called every second for 20 seconds. If I write a query to update a table, it will be executed 20 times, but I want the update to run only once per 20 calls. After 20 seconds the table needs to be available for further updates. How can I accomplish this? Is there anything like a trigger that can automatically prevent the update for a certain period of time?
I have tried something so far:
I keep a record in a table with the current timestamp, and I check that timestamp on the next call. If more than 20 seconds have passed, I run the update; otherwise I skip it. It works, but are there any more efficient methods?
The fun and interesting method: on PHP 5.3 you can use the APC cache to store a flag for 20 seconds, and run your query only when that flag does not exist. (This changed around PHP 5.5, when PHP adopted a different caching approach; the user cache now lives in the APCu extension.)
if (apc_fetch('key') === false) {
    // Run your query, then store the flag for 20 seconds
    apc_store('key', true, 20);
}
The boring and dull but solidly future-proof method is to use a session variable to do effectively the same thing: just check whether you are still within the 20-second limit.
if (!isset($_SESSION['timer']) || strtotime($_SESSION['timer']) <= strtotime("-20 seconds")) {
    // Run your query and update the timer with the update time.
    $_SESSION['timer'] = date("Y-m-d H:i:s");
}
Related
I have a cron job that runs once every hour, to update a local database with hourly data from an API.
The database stores hourly data in rows, and the API returns 24 points of data, representing the past 24 hours.
Sometimes a data point is missed, so when I get the data back I can't just update the latest hour: I also need to check whether I already had this data, and fill in any gaps where gaps are found.
Everything is running and working, but the cron job takes at least 30 minutes to complete every time, and I wonder if there is any way to make this run better / faster / more efficiently?
My code does the following: (summary code for brevity!)
// loop through the 24 data points returned
for ($i = 0; $i < 24; $i++) {
    // check if the data is for today, because the past 24 hours will include data from yesterday
    if ($thisDate == $todaysDate) {
        // check if data for this id and this time already exists
        $query1 = "SELECT reference FROM mydatabase WHERE id='$id' AND hour='$thisTime'";
        // if it doesn't exist, insert it
        if ($datafound == 0) {
            $query2 = "INSERT INTO mydatabase (id, hour, data_01) VALUES ('$id', '$thisTime', '$thisData')";
        }
    }
}
And there are 1500 different IDs, so it does this 1500 times!
Is there any way I can speed up or optimise this code so it runs faster and more efficiently?
This does not seem very complex, and it should run in a few seconds, so my first guess, without knowing your database, is that you are missing an index. Please check whether there is an index on your id field. If id is not your unique key, you should consider adding a composite index on the two fields id and hour. If these aren't already there, this should lead to a massive time saving.
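For instance, assuming the table and column names from the question's summary code (adjust them to your real schema), the composite index could be added like this:

```sql
-- Composite index matching the WHERE clause (id, hour) of the SELECT
ALTER TABLE mydatabase ADD INDEX idx_id_hour (id, hour);
```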
Another idea could be to retrieve all data for the last 24 hours in a single SQL query, store the values in an array, and do your "have I already seen this data?" checks against the array only, instead of issuing a SELECT per point.
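A sketch of that idea, assuming a mysqli connection in $con and the table layout from the question (the helper function names here are made up for illustration):

```php
<?php
// One query for everything from the last 24 hours, instead of one SELECT per point:
// $result = mysqli_query($con,
//     "SELECT id, hour FROM mydatabase WHERE hour >= NOW() - INTERVAL 24 HOUR");
// $rows = mysqli_fetch_all($result, MYSQLI_ASSOC);

// Build a lookup set keyed on "id|hour" so each existence check is O(1).
function build_lookup(array $rows): array
{
    $seen = [];
    foreach ($rows as $row) {
        $seen[$row['id'] . '|' . $row['hour']] = true;
    }
    return $seen;
}

// Return only the API points that are not in the database yet.
function missing_points(array $apiPoints, array $seen): array
{
    $toInsert = [];
    foreach ($apiPoints as $p) {
        if (!isset($seen[$p['id'] . '|' . $p['hour']])) {
            $toInsert[] = $p; // each of these becomes an INSERT (ideally batched)
        }
    }
    return $toInsert;
}
```

The missing points can then be written with a single multi-row INSERT per batch rather than 24 separate statements per id.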
So I'm using WampServer with the default phpMyAdmin to store this SQL table called Typing.
Table: Typing
Now I want to set the typing column to 0 for any row whose typing column was set to 1 more than five seconds ago.
For example, I just set the typing column to 1 for the first row; my database detects the time since this 1 was written, then sets a 5-second timer to revert that 1 back to a 0. If the 1 is overwritten with another 1 during that time, the timer should reset.
How should I go about this? Should I have a column for a 'timestamp' of each record? How do I make my database constantly check for entries older than 5 seconds without user input? Do I need an always on PHP script or a database trigger and how would I go about that?
As @JimL suggested, it might be a bit too ambitious to purge the records after only five seconds.
It might be helpful to have more information about what you're trying to accomplish, but I'll answer in a generic way that should cover your question.
How I would handle this: any query should simply select records that are less than five seconds old (I assume you're querying the data and only want records from the last five seconds; otherwise I'm not really following the point of your question).
Once a day, or hourly if you have that much data, you can run a scheduled job (scheduled through MySQL itself, not through cron/Windows Scheduled Tasks) to purge the old records. You can use phpMyAdmin to set that up (the "Events" tab), although it's actually a MySQL feature that doesn't require phpMyAdmin.
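As a sketch of such a scheduled job, assuming the `typing` table and `recordDate` timestamp column mentioned later in this thread (the event name and retention period are illustrative):

```sql
-- The event scheduler must be enabled once:
SET GLOBAL event_scheduler = ON;

CREATE EVENT purge_old_typing
ON SCHEDULE EVERY 1 HOUR
DO
  DELETE FROM `typing`
  WHERE recordDate < NOW() - INTERVAL 1 DAY;
```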
I got it, I added a timestamp to each record and used this code:
mysqli_query($con, "DELETE FROM `typing` WHERE TIMESTAMPDIFF(SECOND, recordDate, CURRENT_TIMESTAMP) > 1");
It's not a cron job, though, so it only runs when someone accesses the site, but it's good enough for what I need. Thanks for the help everyone :)
I have a script that runs via cron and processes each row (or user) in one of the tables in my database, uses cURL to pull a URL based on the username found in the row, and then adds or updates additional information in the same row. This works fine for the most part, but it takes 20+ minutes to go through the whole database, and it seems to get slower and slower the farther it is into the while loop. I have about 4000 rows at the moment, and there will be even more in the future.
Right now a simplified version of my code is like this:
$i = 0;
while ($i < $rows) {
    $username = mysql_result($query, $i, "username");
    curl_setopt($ch, CURLOPT_URL, 'http://www.test.com/' . $username . '.php');
    $page = curl_exec($ch);
    preg_match_all('htmlcode', $page, $test); // 'htmlcode' stands in for the real pattern
    foreach ($test as $test3) {
        $test2 = $test3[0];
    }
    mysql_query("UPDATE user SET info = '$test2' WHERE username = '$username'");
    ++$i;
}
I know MySQL queries shouldn't be in a while loop, and it's the last query left for me to remove, but what is the best way to handle a while loop that needs to run over and over for a very long time?
I was thinking the best option would be to have the script run through the rows ten at a time then stop. For instance, since I have the script in CRON, I would like to have it run every 5 minutes and it would run through 10 rows, stop, and then somehow know to pick up the next 10 rows when the CRON job starts again. I have no idea how to accomplish this however.
Any help would be appreciated!
About loading the data step by step:
You could add a column "last_updated" to your table and update it every time you load the page. Then you compare the column with the current timestamp before you load the website again.
Example:
mysql_query("UPDATE user SET info = '$test2', last_updated = " . time() . " WHERE username = '$username'");
And when you load your data, select only the rows that are due for a refresh, e.g. "WHERE last_updated < " . (time() - $time_since_last_update)
What about dropping the 'foreach' loop?
Just use the last element of the $test array.
LIMIT and OFFSET are your friends here. Keep track of where you are through a DB field as suggested by Bastian or you could even store the last offset you used somewhere (could be a flat file) and then increase that every time you run the script. When you don't get any more data back, reset it to 0.
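A minimal sketch of the flat-file variant (the state-file path, batch size, and function name are made up for illustration):

```php
<?php
// Process $batchSize rows per cron run, remembering the next offset in a
// small state file so the next run picks up where this one left off.
function next_offset(string $stateFile, int $batchSize, int $totalRows): int
{
    $offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;
    $next = $offset + $batchSize;
    // Once we have walked past the end of the table, start over next run.
    file_put_contents($stateFile, $next >= $totalRows ? 0 : $next);
    return $offset;
}

// Each cron run would then do something like:
// $offset = next_offset('/tmp/curl_job.offset', 10, $totalRows);
// SELECT username FROM user ORDER BY username LIMIT 10 OFFSET $offset
```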
I have an application that unfortunately uses legacy mysql_* functions with MyISAM tables (bleagh...), so I cannot use transactions. I have code that gets the current balance, checks whether this balance is okay, and if so, subtracts a quantity and saves the new balance.
The problem is, I have recently seen an instance where two queries grab the same starting balance, subtract a quantity, then record a new balance. Since they both grabbed the same starting balance, the ending balance after both UPDATES is wrong.
100 - 10 = 90
100 - 15 = 85
When it should be...
100 - 10 = 90
90 - 15 = 75
These requests executed several minutes apart, so I do not believe the discrepancy is due to a race condition. My initial thought is that the MySQL query cache stored the result of the identical initial query that gets the balance. I have read, however, that this type of cache entry is invalidated if any relevant table is modified.
I will most likely fix this by putting everything into one query, but I would still like to figure this out; it mystifies me. If the cache is invalidated when a table is modified, then what happened shouldn't have happened. Has anyone heard of something like this, or have any ideas as to why it may have happened?
It's highly unlikely to be a query cache - MySQL is smart enough to invalidate a cache entry if the underlying data set has been modified by another query. If the query cache kept around old stale values long past their expiration, MySQL would be utterly useless.
Do you perhaps have outstanding uncommitted transactions causing this? Without the appropriate locks on the relevant records, your second query could be grabbing stale data quite easily.
Most likely your application has stale data. This is fine, it's how many database applications work, but when you perform your update, instead of doing something like this:
UPDATE account
SET balance = :current_balance - 10
WHERE account_id = 1
You need to do something more like this:
UPDATE account
SET balance = balance - 10
WHERE account_id = 1
That way, you use the current balance from the database, even if somebody changed it in the mean time, instead of relying on stale application data.
If you want to only change the value if no one else has modified it, then you do something like this:
UPDATE account
SET balance = balance - 10
WHERE account_id = 1
AND balance = :current_balance
If the number of affected rows is 1, then you succeeded, the record hadn't changed by someone else. However, if the number of affected rows is 0, then somebody else changed the record. You can then decide what you want to do from there.
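The decision logic can be followed end to end in plain PHP; here an array stands in for the account row, and the function name is made up for illustration. In real SQL the check is the AND balance = :current_balance clause plus the affected-rows count:

```php
<?php
// Compare-and-swap update: apply the debit only if the caller's expected
// balance still matches the stored one; otherwise the caller must re-read.
function try_debit(array &$account, int $amount, int $expectedBalance): bool
{
    if ($account['balance'] !== $expectedBalance) {
        return false; // somebody else updated the row first (0 affected rows)
    }
    $account['balance'] -= $amount; // corresponds to SET balance = balance - :amount
    return true;
}
```

With the numbers from the question: the first writer debits 10 from 100 and succeeds; the second writer, still holding the stale balance of 100, is rejected, re-reads 90, and then debits 15 to reach the correct 75.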
Locking the tables is the solution to your problem, I think :)
I have a website where users can do things like leave comments, perform likes, etc.
I want to calculate a score for each user that is composed of these actions, i.e. like = 10 points, comment = 20 points, etc.
I would like to create a table with user scores that is calculated once a day.
Currently I use a PHP script and execute it, but it takes too long to calculate and then it times out.
What is the best method for performing this?
Basically, you record what the user does in your web application as it happens. Instead of recalculating everything in one batch, you can:
save each action along with its own score, and then update the user's score once a day with a simple aggregate query, or
keep a running total independently of each action, adding the action's score to the user's total directly as it happens.
TIP
Since you plan to update the score only once a day, you can also cache the result for a long duration so that the query does not harm performance.
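The once-a-day update from the first option might look like this (a sketch; the actions and user_score table and column names are assumptions, not from the question):

```sql
-- Recompute each user's total from the logged actions in one statement
INSERT INTO user_score (user_id, score)
SELECT user_id, SUM(points)
FROM actions
GROUP BY user_id
ON DUPLICATE KEY UPDATE score = VALUES(score);
```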
How long does the script take?
If you just want to increase the amount of time the script is allowed to take to execute, you can make use of PHP's set_time_limit() to increase the time.
Start your script by running:
set_time_limit(120); // Script is allowed to execute for 2 minutes if necessary