Say we are a site receiving massive amounts of traffic, Amazon.com size traffic. And say we wanted to display a counter on the home page displaying the total number of sales since December the first and the counter was to refresh via ajax every 10 seconds.
How would we go about doing this?
Would we have a summary database table displaying the total sales and each checkout would +1 to the counter and we would get that number every 10 seconds? Would we COUNT() the entire 'sales' table every 10 seconds?? Is there an external API I can push the stats off to and then do an ajax pull from them?
Hope you can help, Thanks
If your site is ecomm based, in that you are conducting sales, then you MUST have a sales tracking table somewhere. You could simply make the database count part of the page render when a user visits or refreshes your site.
IMO, there is no need to ajax this count as most visitors won't really care.
Also, I would recommend this query be run against a readonly (slave) database if your traffic is truly at amazon levels.
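For illustration, a sketch of that page-render approach with a placeholder replica connection and a hypothetical created_at column (none of these names come from the posts):

// Run the count against the read-only replica as part of rendering the page.
$replica = new PDO('mysql:host=replica.db.internal;dbname=shop', 'readonly', 'secret');

$stmt = $replica->prepare('SELECT COUNT(*) FROM sales WHERE created_at >= ?');
$stmt->execute(['2023-12-01']);                    // "since December the first"
$totalSales = (int) $stmt->fetchColumn();

// Then print $totalSales wherever the counter lives in the page template.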
I would put triggers on the tables to manage the counter tables. When inserting a new sale the sum table would get the new value added to the row for the current day. That also gives sales per day historically without actually querying the big table.
Also, it allows orders to be entered manually for dates other than today, and that day's statistics get updated.
As for the Ajax part that's just going to be a query into that sum table.
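A rough sketch of that trigger plus summary-table setup and the Ajax query, using made-up table and column names (sales, sale_date, daily_sales) and placeholder credentials:

// One-time setup: a per-day summary table plus a trigger that bumps the row
// for the sale's own date on every insert (so manually entered orders for
// other dates update that day's statistics, as described above).
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'app', 'secret');

$pdo->exec('
    CREATE TABLE IF NOT EXISTS daily_sales (
        sale_date  DATE NOT NULL PRIMARY KEY,
        sale_count INT UNSIGNED NOT NULL DEFAULT 0
    )
');

$pdo->exec('
    CREATE TRIGGER sales_after_insert
    AFTER INSERT ON sales
    FOR EACH ROW
        INSERT INTO daily_sales (sale_date, sale_count)
        VALUES (DATE(NEW.sale_date), 1)
        ON DUPLICATE KEY UPDATE sale_count = sale_count + 1
');
// (Updates and deletes would need matching triggers; omitted here.)

// The Ajax endpoint then only touches the small summary table:
$stmt = $pdo->prepare('SELECT COALESCE(SUM(sale_count), 0) FROM daily_sales WHERE sale_date >= ?');
$stmt->execute(['2023-12-01']);
echo json_encode(['total' => (int) $stmt->fetchColumn()]);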
Whatever you do, do not re-COUNT everything every 10 seconds. Why not have a cron job that does the counting every 10 seconds? It could take the current time minus 10 seconds, count what was added since then in the slave database, and add the difference to the current count.
Still, 10 seconds sounds bizarre. Every minute, maybe?
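One possible reading of that suggestion, sketched with placeholder names (the counter_cache table, the column names, and both connections are assumptions, not from the post):

// Cron script (every minute or so): count only the sales added since the last
// run on the replica, then add the difference to a cached running total that
// the Ajax endpoint reads.
$replica = new PDO('mysql:host=replica.db.internal;dbname=shop', 'readonly', 'secret');
$primary = new PDO('mysql:host=primary.db.internal;dbname=shop', 'app', 'secret');

$cache = $primary->query('SELECT total, counted_until FROM counter_cache WHERE id = 1')->fetch();

$stmt = $replica->prepare('SELECT COUNT(*), COALESCE(MAX(created_at), ?) FROM sales WHERE created_at > ?');
$stmt->execute([$cache['counted_until'], $cache['counted_until']]);
[$newSales, $newCheckpoint] = $stmt->fetch(PDO::FETCH_NUM);

$primary->prepare('UPDATE counter_cache SET total = total + ?, counted_until = ? WHERE id = 1')
        ->execute([$newSales, $newCheckpoint]);
// (Boundary handling is simplified; sales sharing the exact checkpoint timestamp could be missed.)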
I created a sort of voting website with PHP and a MySQL database. People can vote on ids that are in the database, and the number of likes goes up.
When you click a button, the likes of that specific id go up by one. This is done through Ajax and SQL.
Is it possible to set a limit on the number of likes an id can receive each day? For instance, each id can be updated only 10 times each day, and the next day another 10 times.
Yes, you can add that condition in the code that handles the Ajax request.
Before inserting the vote record in the DB, first check the total number of votes received for that ID on the current day.
If it is equal to 10, skip the operation; otherwise increase the vote.
You have to put this conditions in this file /10jaar/views/submissions.php
Not a big deal, I think.
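A rough sketch of that check in the PHP that handles the Ajax request; the likes table and its columns are guesses here, so adjust them to your own schema (and to /10jaar/views/submissions.php):

$pdo = new PDO('mysql:host=localhost;dbname=votes', 'app', 'secret');
$id  = (int) $_POST['id'];

// How many likes did this id already receive today?
$stmt = $pdo->prepare('SELECT COUNT(*) FROM likes WHERE item_id = ? AND DATE(created_at) = CURDATE()');
$stmt->execute([$id]);

if ((int) $stmt->fetchColumn() >= 10) {
    echo json_encode(['ok' => false, 'message' => 'Daily limit reached']);
    exit;                                  // skip the operation
}

$pdo->prepare('INSERT INTO likes (item_id, created_at) VALUES (?, NOW())')->execute([$id]);
echo json_encode(['ok' => true]);          // or else increase the vote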
There is a really serious issue with double-entry accounting systems and pagination. I think it is common, but I still haven't found any solution for my problem yet.
You can use this link to read about simple double-entry accounting systems like the one I made with Laravel and AngularJS.
In this system, the expected result (for example) is something like this:
ID     In       Out      Balance
1      100.00     0.00    100.00
2       10.00     0.00    110.00
3        0.00    70.00     40.00
4        5.00     0.00     45.00
5        0.00    60.00    -15.00
6       20.00     0.00      5.00
It is very easy to track the balance inside a cumulative function if you are showing all the transactions on one page: the balance on the last transaction is your current balance at the end of the day.
For example, for a specific range of dates $fromDate->$toDate, we do something like:
$balanceYesterday = DB::table('journal')->where('date', '<', $fromDate)
->join('transactions','transactions.journal_id','journal.id')->where('transactions.type', "=", 0) /* 0 means the account of the company */
->select(DB::raw('SUM(amount) as total_balance'))
->first()->total_balance;
Now that we have the balance from yesterday, we depend on it to calculate the balance after every following transaction in a cumulative loop, until we reach $toDate:
$currentBalance = $currentBalance + $currentTransaction->amount;
$currentTransactionBalance = $currentBalance;
Now the real problem starts when you have a large number of transactions and you need to paginate them, $journal = $journal->paginate(100);, let's say 100 transactions per page. The system works as expected for the first page, since we can already calculate $balanceYesterday and depend on it to calculate the new balance after every transaction, up to the end of the 100 transactions on the first page.
The next page has the problem that it doesn't know the balance after the last transaction on the previous page, so it starts again from $balanceYesterday, making the whole table show wrong calculations.
What I did first to fix this was transferring the last transaction's balance (in the front-end) to the next page as a parameter and using it as the starting amount for the new calculation. That was the best solution I had, as I was using only << PREV and NEXT >> buttons, so it was very easy to fix it like that.
But I lately found out that this workaround will not work if I have pagination with page numbers, as the user would like to jump between pages to explore the journal; now it is impossible to know the last balance at a specific page, and the system will show wrong calculations.
What I am trying to do is find a way to calculate the balance at a specific transaction, whether it was a credit or debit; I'm looking for a way to know what the balance was right after a specific transaction on a specific date. I do NOT want to add a new balance column and save the balance inside it: the user makes a lot of modifications and edits to the transactions from time to time, and that would break everything, since a small change to an amount affects all the balances after it. I also cannot depend on transaction IDs in any method, because transactions might have different random dates, so there will be no ordering by ID, but there might be ordering by other fields like date, account owner, type, or whatever.
I've been scratching my head over this for about 4 months; I searched online and found no solutions. I hope after this long explanation that my problem is clear, and I hope somebody can help me with a solution, please.
Thank you.
I believe the only thing you really need at this point is to calculate the sum of all transactions from the beginning of the paginated data set (all records, not just the current page's) until one before the first record displayed on the current page.
You can get this by finding the number of transactions that occurred between the start of your entire data set and the current page's transactions, retrieving them via LIMIT, and adding them up.
The first thing you'll want to have is the exact constraints of your pagination query. Since we want to grab another subset of paginated records besides the current page, you want to be sure the results of both queries are in the same order. Reusing the query builder object can help (adjust to match your actual pagination query):
$baseQuery = DB::table('journal')
->join('transactions', 'transactions.journal_id', 'journal.id')
->where('date', '>', $fromDate)
->where('date', '<', $toDate)
->where('transactions.type', "=", 0)
->orderBy('date', 'asc');
// Note that we aren't fetching anything here yet.
Then, fetch the paginated result set. This will perform two queries: one for the total count of records, and a second for the specific page's transactions.
$paginatedTransactions = (clone $baseQuery)->paginate(100); // clone, so $baseQuery stays free of the page's limit/offset
From here, we can determine which records we need to retrieve the previous balance for. The pagination object returned is an instance of LengthAwarePaginator, which knows how many records there are in total, the number of pages, which page it's currently on, etc.
Using that information, we just do some math to grab the number of records we need:
total records needed = (current page - 1) * records per page
Assuming the user is on page 5, they will see records 401 - 500, so we need to retrieve the previous 400 records.
// If we're on Page 1, or there are not enough records to
// paginate, we don't need to calculate anything.
if ($paginatedTransactions->onFirstPage() || ! $paginatedTransactions->hasPages()) {
// Don't need to calculate a previous balance. Exit early here!
}
// Use helper methods from the Paginator to calculate
// the number of previous transactions.
$limit = ($paginatedTransactions->currentPage() - 1) * $paginatedTransactions->perPage();
Now that we have the number of transactions that occurred within our data set but before the current page, we can retrieve and calculate the sum by again utilizing the base query:
// Apply the LIMIT inside a subquery, otherwise sum() would total every matching row (fromSub needs Laravel 5.6+):
$previousBalance = DB::query()->fromSub((clone $baseQuery)->select('amount')->limit($limit), 'prev')->sum('amount');
A highlight here: letting your database perform the SUM calculation is a big performance benefit compared to doing it in a loop in PHP. Take advantage of the DB as often as you can!
Add this balance to your original "yesterday" balance, and you should have an accurate beginning balance for the paginated transactions.
Note: everything pseudo-coded from theory, may need adjustments. Happy to revise if there are questions or issues.
You should be able to compute the balance for each record, as long as you can tell what the ordering is, so that you can calculate the cumulative sum up to each point within that ordered list.
For sure this comes with massive overhead, as you need to query the whole table for each record you display, but first of all one must be able to do it at all. As you've shown in the example, you are able to, as long as you do not paginate.
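For what it's worth, here is a sketch of that per-record calculation in the same Laravel style as the question; the journal-id tiebreaker for identical dates and the $record->journal_id attribute are my assumptions, not something from the post:

// Balance shown right after one displayed record = opening balance before $fromDate
// plus every transaction from $fromDate up to and including this record,
// taken in display order (date, then journal id as a tiebreaker).
$balanceAfter = $balanceYesterday + DB::table('journal')
    ->join('transactions', 'transactions.journal_id', 'journal.id')
    ->where('transactions.type', '=', 0)
    ->where('date', '>=', $fromDate)
    ->where(function ($query) use ($record) {
        $query->where('date', '<', $record->date)
              ->orWhere(function ($query) use ($record) {
                  $query->where('date', '=', $record->date)
                        ->where('journal.id', '<=', $record->journal_id);
              });
    })
    ->sum('amount');
// This is the "massive overhead" version: one such query per displayed record.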
What you could do for pagination is to pre-calculate the balance for each record and store it in relation to the original record. This would de-normalize your data, but with the benefit that building the pagination becomes rather straightforward.
On the website I'm working on I need to add user points. Every user will have their own points, and the maximum number of points will be 200. Upon registration a user gets 100 points, and various tasks deduct points from the user.
The main problem I'm struggling with is how to add points to the user, since every user needs to get 1 point every hour unless they have 200 or more points.
My first thought was a cron job that runs a script every hour which checks whether the user is verified and has fewer than 200 points, and adds 1 point to every such user.
But after some reading I'm thinking of a different approach which I don't understand quite exactly. The better, less resource-consuming approach would be to run a function every time a user logs in that checks how many points they have and adds the appropriate number of points. The problem is I don't know how to set it up: how do I calculate how many points to add if the user was offline for, say, 8 hours, and how do I time it? Or should I maybe even use Ajax with a timer?
What would be your suggestion and approach to this?
Edit: Just to add, since you asked: users can't see each other's points.
When a user does something, check the last time you gave them points. If it was 5 hours ago, give them 5 points. If it was 10 hours ago, give them 10 points. Etc. Implement caching so if a user visits your site 50 times in one hour, you don't have to check against the DB every time.
Anyway, short answer is, do the check when loading the user data, rather than automatically every hour for all users whether they are active or not.
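A minimal sketch of that, assuming hypothetical points and points_updated_at columns on the users table (names and connection details are placeholders):

// Call this whenever the user shows up (login, page load, ...).
function accruePoints(PDO $pdo, int $userId): void
{
    // One statement: grant one point per full hour since the last grant, cap at
    // 200 via LEAST, and move the "last granted" timestamp forward by exactly
    // the hours granted, so partial hours are not lost.
    $sql = "
        UPDATE users
        SET points = LEAST(points + TIMESTAMPDIFF(HOUR, points_updated_at, NOW()), 200),
            points_updated_at = points_updated_at
                + INTERVAL TIMESTAMPDIFF(HOUR, points_updated_at, NOW()) HOUR
        WHERE id = ?
    ";
    $pdo->prepare($sql)->execute([$userId]);
    // Add caching in front of this (as suggested above) so 50 visits in one
    // hour don't mean 50 UPDATEs.
}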
UPDATE users
SET points = LEAST(points + 1, 200)
I don't really see the problem with this script running as a cron. It would be more of a problem if you handled each event as transaction points, since you'd have to run something like:
# Generates a row each hour per uncapped user, which may become a lot
INSERT INTO transactions (points, type, created)
SELECT 1, 'HOURLY_INCOME', NOW()
FROM users
WHERE points < 200
Is it relevant for other users, or for official/unofficial statistics, to check what a user's current points are? This is quite relevant, since the approach won't fully work if it only updates upon login.
user_table
---------------
id | reg_date
1 | 2013-10-10 21:10:15
2 | 2013-10-11 05:56:32
Just look at how many hours have passed since the user's registration and add 100 points:
SELECT
TIMESTAMPDIFF(HOUR, `reg_date`, NOW())+100 AS `p`
FROM
user_table
WHERE
id = 1
And then check in PHP: if the result is more than 200, just show 200.
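In PHP, that check could look like this (a minimal PDO sketch reusing the table from the answer above; connection details are placeholders):

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'app', 'secret');
$stmt = $pdo->prepare('SELECT TIMESTAMPDIFF(HOUR, reg_date, NOW()) + 100 AS p FROM user_table WHERE id = ?');
$stmt->execute([1]);

$points = min((int) $stmt->fetchColumn(), 200);   // show at most 200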
Hmm, since MySQL 5.1 there is a neat feature that is basically MySQL's own cron, called the MySQL Event Scheduler, and I think I'll go with that for now, since the script will be very easy to write, small, and not time consuming.
All I need to do is write
UPDATE users SET points = (points +1) WHERE points<200
And add it as a MySQL event recurring every hour.
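For reference, a sketch of what creating that event could look like, run once from PHP with placeholder credentials (you could just as well paste the same statements into the MySQL console):

$pdo = new PDO('mysql:host=localhost;dbname=app', 'root', 'secret');

// The scheduler has to be switched on once (needs sufficient privileges).
$pdo->exec('SET GLOBAL event_scheduler = ON');

// Recurring event: +1 point per hour for every user under the 200 cap.
$pdo->exec("
    CREATE EVENT IF NOT EXISTS add_hourly_points
    ON SCHEDULE EVERY 1 HOUR
    DO
        UPDATE users SET points = points + 1 WHERE points < 200
");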
I have a website where users can do things like leave comments, perform likes, etc.
I want to calculate a score for each user that is composed of these actions, e.g. a like = 10 points, a comment = 20 points, etc.
I would like to create a table with user scores that will be calculated once a day.
Currently I use a PHP script and execute it, but it takes too long to calculate and then it times out.
What's the method for performing this?
Basically, instead of calculating everything in one daily batch, you can record what the user does in your web application. Then you can either:
save each action along with its own score, and update the user's total score with a simple query that you run once a day, or
keep a running total independent of the individual actions, adding each action's score directly to the user's total as it happens.
TIP
Since you plan to update the score once a day per user, you can also cache it for a long duration so that the query does not harm performance.
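A sketch of the first option, assuming hypothetical actions(user_id, points) and user_scores(user_id, score) tables; run it once a day from cron:

$pdo = new PDO('mysql:host=localhost;dbname=app', 'app', 'secret');

// Recompute every user's total from the recorded actions in one statement.
// user_scores.user_id is assumed to be the primary key.
$pdo->exec('
    INSERT INTO user_scores (user_id, score)
    SELECT user_id, SUM(points)
    FROM actions
    GROUP BY user_id
    ON DUPLICATE KEY UPDATE score = VALUES(score)
');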
How long does the script take?
If you just want to increase the amount of time the script is allowed to take to execute, you can make use of PHP's set_time_limit() to increase the time.
Start your script by running:
set_time_limit(120); // Script is allowed to execute for 2 minutes if necessary
I have a person's username, and he is allowed ten requests per day. Every day the requests go back to 0, and old data is not needed.
What is the best way to do this?
This is the way that comes to mind, but I am not sure if it's the best way
(two fields, today_date, request_count):
Query the DB for the date of last request and request count.
Get result and check if it was today.
If today, check the request count; if it is less than 10, update the database to increment the count.
If not today, update the DB with today's date and count = 1.
Is there another way with fewer DB queries?
I think your solution is good. It is possible to reset the count on a daily basis too. That will allow you to skip a column, but you do need to run a cron job. If there are many users that won't have any requests at all, it is needless to reset their count each day.
But whichever you pick, both solutions are very similar in performance, data size and development time/complexity.
Just one column, request_count. Then query this column and update it. As far as I know, with stored procedures this may be possible in one single query. Even if not, it will be just two. Then create a cron job that calls a script that resets the column to 0 every day at 00:00.
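As a side note, the check-and-increment itself doesn't strictly need a stored procedure; one way, sketched with placeholder names, is a conditional UPDATE plus a look at the affected row count:

$pdo      = new PDO('mysql:host=localhost;dbname=app', 'app', 'secret');
$username = 'alice';   // hypothetical user

// Only increments while the user is still under the limit.
$stmt = $pdo->prepare('UPDATE users SET request_count = request_count + 1
                       WHERE username = ? AND request_count < 10');
$stmt->execute([$username]);

if ($stmt->rowCount() === 0) {
    // Limit reached (or unknown user): reject this request.
}
// The daily cron then simply runs: UPDATE users SET request_count = 0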
To spare you some requests to the DB, define:
the maximum number of requests per day allowed.
the first day available to your application (date offset).
Then add a requestcount field to the database per user.
On the first request get the count from the db.
The count is always the day number multiplied by (the maximum requests per day + 1), plus the actual requests made by that user that day:
day * (max + 1) + n
So if on first request the count from the db is actually higher than allowed, block.
Otherwise, if it's lower than the current day base, reset it to the current day base (in the PHP variable), then count up and store this value back into the DB.
This is one read operation, and in case the request is still valid, one write operation to the DB per request.
There is no need to run a cron job to clean this up.
That's actually the same as you propose in your question, but the day information is part of the counter value. So you can do more with one value at once, while counting up with +1 per request still works for the block.
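A sketch of that scheme in PHP; the request_counter column, the epoch date, and the connection details are placeholders:

$maxPerDay = 10;                                        // requests per day allowed
$epoch     = new DateTimeImmutable('2023-01-01');       // first day available (date offset)
$day       = $epoch->diff(new DateTimeImmutable('today'))->days;
$dayBase   = $day * ($maxPerDay + 1);                   // day * (max + 1)

$pdo      = new PDO('mysql:host=localhost;dbname=app', 'app', 'secret');
$username = 'alice';   // hypothetical user

$stmt = $pdo->prepare('SELECT request_counter FROM users WHERE username = ?');
$stmt->execute([$username]);
$count = (int) $stmt->fetchColumn();

if ($count >= $dayBase + $maxPerDay) {
    // Today's quota is used up: block the request.
} else {
    // Older day (or new user): jump to today's base, then count this request.
    $count = max($count, $dayBase) + 1;
    $pdo->prepare('UPDATE users SET request_counter = ? WHERE username = ?')
        ->execute([$count, $username]);
}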
You have to take into account that each user may be in a different time zone than your server, so you can't just store the count or use the "day * max" trick as-is. Try to get the time offset, and then the start of the user's day can be stored in your "quotas" database. In MySQL, that would look like:
`start_day`=ADDTIME(CURDATE()+INTERVAL 0 SECOND,'$offsetClientServer')
Then simply look at this time the next time you check the quota. The quota check can all be done in one query.