I am wondering about the best approach when making a PHP-based game that requires the player to wait for something. For example: I pay 100 gold to explore, I receive loot every 5 minutes, and the exploration ends after 30 minutes. I want to know which of the following options is best, and why:
1) Keep a record of the time the exploration command was issued; then, every time that specific player opens the page, calculate everything, show the result, and store it in the database.
2) Run a cron job every 5 minutes that calculates the exploration progress of every player currently exploring and updates the database.
3) Run a cron job every 30 minutes that calculates and updates everything for every player, but also allow a specific player to trigger an update on page load, just like option 1.
Option 3 is basically a combination of options 1 and 2. I am not sure about the performance implications, so I need input from people who already have experience with this. Thanks for the help.
These are just personal opinions and might not be the best choice.
2) is more of a general approach for a multiplayer game with player interaction, but it puts constant strain on the server, which seems like overkill, as I seriously doubt your game has complex interaction between players.
1) is probably the way to go, unless your calculation is very complex and takes a long time. The possible drawback is that you may have trouble handling lots of simultaneous update requests, but from what you describe I don't think that will happen.
3) is hard to comment on, because I have no idea whether your calculation time depends on how much time has passed since the last update. If your calculation is time-independent, it is a poor method: you spend time updating data that no one may need, AND you are still open to traffic spikes.
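Since option 1 is the recommended route, here is a minimal sketch of it in PHP, run when the player loads the page. The table and column names (explorations, started_at, loot_collected) and the grantLoot() helper are assumptions for illustration:

    <?php
    // Lazily settle exploration loot when the player opens the page.
    // $playerId is assumed to come from the session.
    $pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

    $stmt = $pdo->prepare(
        'SELECT started_at, loot_collected FROM explorations WHERE player_id = ?'
    );
    $stmt->execute([$playerId]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    // One loot tick every 5 minutes, capped at the 30-minute exploration length.
    $elapsed     = min(time() - strtotime($row['started_at']), 30 * 60);
    $ticksEarned = intdiv($elapsed, 5 * 60);
    $newTicks    = $ticksEarned - (int) $row['loot_collected'];

    if ($newTicks > 0) {
        grantLoot($playerId, $newTicks); // hypothetical game-logic helper
        $upd = $pdo->prepare(
            'UPDATE explorations SET loot_collected = ? WHERE player_id = ?'
        );
        $upd->execute([$ticksEarned, $playerId]);
    }

This way nothing runs while the player is away; the database only stores the start time and how many ticks have already been paid out.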
I am building an application in PHP that requests data from a third-party API, stores and processes that data, and then submits additional API requests based on the data received from the first request.
The issue is that there are several rate limits, and where there is a large volume of data to be requested, I need to make many paginated API requests at 2-second intervals to avoid being blocked by the rate limits. Essentially, the program keeps looping, making API requests every 2 seconds, until there is no longer a next-page URL in the response header.
Depending on the amount of data, this could take several minutes, up to several hours. I can increase the max execution time in php.ini, but this is not efficient and could still result in a timeout if one day the program has too much data to work with.
I'm sure there must be a better way to manage this, possibly with serverless functions or some kind of queuing system running in the background. I have never worked with serverless functions, so there will be a learning curve, but I am happy to learn if needed.
I would love to hear what anyone thinks the best solution is. I am building the application in PHP, but I can work with JS or Node.js if I need to.
Many thanks in advance.
You can use a queue for that. There are plenty of packages, and you can choose one depending on your needs.
You can also use asynchronous requests, for example from Guzzle or another vendor (which speeds up the reading process), and you can easily implement a delayed-retry middleware for the rate limiter.
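As a hedged sketch of the retry-middleware idea, here is a Guzzle client that backs off when the API answers 429 and follows pagination until there is no next page. The endpoint URL and the process()/nextPageUrl() helpers are assumptions; in practice you would run this from a queued CLI worker, not a web request:

    <?php
    require 'vendor/autoload.php';

    use GuzzleHttp\Client;
    use GuzzleHttp\HandlerStack;
    use GuzzleHttp\Middleware;
    use Psr\Http\Message\RequestInterface;
    use Psr\Http\Message\ResponseInterface;

    $stack = HandlerStack::create();
    $stack->push(Middleware::retry(
        // Retry up to 5 times when the API answers 429 Too Many Requests.
        function ($retries, RequestInterface $req, ResponseInterface $res = null) {
            return $retries < 5 && $res !== null && $res->getStatusCode() === 429;
        },
        // Back off 2 seconds per attempt (Guzzle expects milliseconds).
        function ($retries) {
            return 2000 * $retries;
        }
    ));
    $client = new Client(['handler' => $stack]);

    // Follow pagination until there is no longer a next-page URL.
    $url = 'https://api.example.com/items'; // assumed endpoint
    while ($url !== null) {
        $response = $client->get($url);
        process(json_decode((string) $response->getBody(), true)); // your handler
        $url = nextPageUrl($response); // parse the next-page URL from the header
        sleep(2); // stay under the documented rate limit
    }

Moving this loop into a queued job (Laravel queues, Symfony Messenger, or a plain cron-driven CLI script) also removes the web-request timeout entirely, because CLI PHP has no max_execution_time by default.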
We're using PHP 7 and have a MySQL DB running on a web server with only 128 MB of RAM.
We have a problem with processing huge numbers of records.
Simple description: we have 40,000 products and we want to collect data about these products to find out whether they need to be updated or not. The query that collects the specific data from another table with 10 million records takes 1.2 seconds, because it contains some SUM functions. We need to run the query for every product individually, because the time range relevant to the SUM differs per product. Because of the mass of queries, the function that iterates over all the products times out (after 5 minutes); that's why we decided to implement a cron job that calls the function, and the function continues with the product it ended on last time. We call the cron job every 5 minutes.
But still, with our 40,000 products, it takes about 30 hours until all the products have been processed. Per cron run, our function processes about 100 products...
How is it possible to deal with such a mass of data? Is there a way to parallelize it, e.g. with pthreads, or does somebody have another idea? Could a server upgrade be a solution?
Thanks a lot!
Nadine
Parallel processing will require resources as well, so with 128 MB it will not help.
Monitor your system to see where the bottleneck is; most probably it is memory, since yours is so low. Once you find the bottleneck resource, you will have to increase it. No amount of tuning and tinkering will solve an overloaded-server issue.
If you can see that it is not a server resource issue (!), it could be at the query level (too many joins, missing indexes, ...), and your 5-minute timeout could be increased.
But start with the server.
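On the query side, one grouped pass can often replace the 40,000 individual SUM queries, provided the per-product time range is stored somewhere the query can reach. A hedged sketch with assumed table and column names (sales, sold_at, relevant_from):

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

    // Run once: a composite index lets MySQL satisfy the SUM from the index alone:
    //   ALTER TABLE sales ADD INDEX idx_product_time (product_id, sold_at, amount);

    // One grouped query instead of one SUM query per product.
    $rows = $pdo->query(
        'SELECT s.product_id, SUM(s.amount) AS total
           FROM sales s
           JOIN products p ON p.id = s.product_id
          WHERE s.sold_at >= p.relevant_from  -- per-product time range
          GROUP BY s.product_id'
    );
    foreach ($rows as $row) {
        // decide here whether product $row['product_id'] needs an update
    }

Even if the single query runs for a while, it touches the 10-million-row table in one pass instead of 40,000 separate queries.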
I am building a browser PHP game.
There will be resources like metal, wood, food, etc. Players will be gaining resources all the time (the gathering speed depending on the levels of their buildings/mines/farms).
The number of resources is saved in a resources table in the database.
Let's say someone will be gaining 50,000 metal hourly.
What is the best way to save these values to the database or recalculate them?
It would be crazy to add these values to the resources table every second to keep it updated. How best to do it?
If you can afford a stateful design, I have found that it is usually best to keep and maintain the values in the session, aggregate the changes, and write them out to the database at set intervals (say, every 10 minutes) or when the session ends.
High rates of update can kill database performance, and this impact is multiplied when the table you're writing to has any significant indexing. Different databases can support different transaction rates, and if you have more than a couple of users, once-per-second updates will simply kill performance.
An alternative is to write these updates to a local or temporary queue table, containing only an index on the auto-increment field, and to have a sweeper process run through it periodically to apply those updates to the eventual target table at low priority. This keeps the update overhead lower and reduces contention on the critical table, but it also means your application logic has to read the database value and add the "pending" changes before it has a usable value.
A last alternative, roughly a midpoint between the two above, is to use a queue for storing pending database changes, though it would make it more difficult to calculate point-in-time values while unwritten changes are still in the queue.
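A hedged sketch of the queue-table idea, with assumed table names (resource_updates, resources): writes go to a narrow append-only table, and a periodic sweeper folds them into the real table.

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

    // Fast path: append the pending change; only the auto-increment id is indexed.
    $stmt = $pdo->prepare(
        'INSERT INTO resource_updates (player_id, resource, delta) VALUES (?, ?, ?)'
    );
    $stmt->execute([$playerId, 'metal', $delta]); // values come from game logic

    // Sweeper (run periodically, e.g. from cron): fold pending rows into the
    // resources table, then delete exactly the rows that were applied.
    $cutoff = (int) $pdo
        ->query('SELECT COALESCE(MAX(id), 0) FROM resource_updates')
        ->fetchColumn();
    $pdo->beginTransaction();
    $pdo->exec(
        "UPDATE resources r
           JOIN (SELECT player_id, resource, SUM(delta) AS total
                   FROM resource_updates
                  WHERE id <= $cutoff
                  GROUP BY player_id, resource) u
             ON u.player_id = r.player_id AND u.resource = r.resource
            SET r.amount = r.amount + u.total"
    );
    $pdo->exec("DELETE FROM resource_updates WHERE id <= $cutoff");
    $pdo->commit();

To show a live value to the player, read resources.amount and add the SUM(delta) of that player's still-unswept rows.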
I want to develop a currency system on a custom forum I've been working on, but I don't know the best approach.
Should I add a new "gold" field to my user table and increment with sql statements?
id, user, pass, created_at, gold
Logic: a user creates a new forum post; update the user table: gold + 1.
OR
Should I add a transactions table that logs everything, and sum the amounts where user_id = x?
id | user_id | amount
 1 |       3 |     +1  (new forum post)
 2 |       3 |     +1  (new forum post)
 3 |      12 |     -5  (item purchase)
 4 |       3 |     -1  (deleted post)
 5 |       9 |     +1  (new forum post)
OR is there an even better approach?
It depends heavily on what you want to do with it and which programming style you prefer.
To approach it with some facts, though:
I expect a forum to be fast. For that, you should only use simple SELECTs; functions like SUM() take a bit more time to perform. In a small system that will most likely not be a problem, but MySQL databases usually scale very badly, so you should keep that in mind from the beginning.
You definitely want a way to track transactions, mostly to be able to check what is actually going on. Even if you build a great system to deal with your gold, you will still want to trace what happened from time to time, and for that it's handier to store transactions.
Redundant data and transaction synchronization can be a problem. Every transaction system has to keep everything synchronized. With MySQL that's not so difficult, as tables can be locked while you perform transactions, but redundant data is far more of a pain: you have to ensure that you change the data everywhere at the same time, before other actions can interfere.
In a basic system, I would store the balance in the user table and keep all transactions as a log in another table, but never use that log for output to the user. Beyond that, it depends on what your system needs.
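A hedged sketch of that hybrid: the user row keeps the cached gold balance, every change is also logged, and both writes happen in one database transaction so they cannot drift apart. Table and column names (gold_transactions, reason) are assumptions:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=forum', 'user', 'pass');

    function changeGold(PDO $pdo, int $userId, int $amount, string $reason): void
    {
        $pdo->beginTransaction();
        // Append to the log so history can always be audited.
        $stmt = $pdo->prepare(
            'INSERT INTO gold_transactions (user_id, amount, reason) VALUES (?, ?, ?)'
        );
        $stmt->execute([$userId, $amount, $reason]);
        // Keep the cached balance in sync inside the same transaction.
        $stmt = $pdo->prepare('UPDATE users SET gold = gold + ? WHERE id = ?');
        $stmt->execute([$amount, $userId]);
        $pdo->commit();
    }

    changeGold($pdo, 3, 1, 'new forum post');
    changeGold($pdo, 12, -5, 'item purchase');

Reads stay a cheap single-row SELECT on users.gold, while the log remains available for auditing.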
What is the best way to send e-mails or perform functions on a small, non-real-money auction script? This script is a learning exercise for me, and I was wondering what the best way would be to process actions once an auction has expired. A cron job every minute seems, to me, a method that could easily be improved upon.
To be honest, this is an ideal situation for a cron job. Theoretically, you could even create cron jobs on the fly: edit the crontab from PHP and create an entry for each auction at its end time that executes a generic script with some variables passed to it.
A cron job every minute seems a bit extreme, but if you space it out a bit, the idea is very reasonable.
An alternative would be that when someone hits the auction page, the script checks whether the auction has expired, then sends an email and sets a flag in a table column. Overall this is a terrible idea (what if no one visits the page for a few days?).
Ideally, the cron job is your best friend here. You can go with an hourly-style cron job, or create a script that generates one-time cron jobs on the fly. The best solution, though, is a recurring cron job (every 10 minutes, perhaps?) that, as it sends out the email for an auction, sets a flag in a column (e.g. email_sent = 1) so that emails aren't resent erroneously. A sketch of such a sweeper follows.
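A hedged sketch of that recurring sweeper, assuming an auctions table with ends_at, winner_email, and email_sent columns, scheduled with a crontab line such as */10 * * * * php /path/to/close_auctions.php:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=auctions', 'user', 'pass');

    // Pick up every auction that has ended and has not been notified yet.
    $expired = $pdo->query(
        'SELECT id, winner_email FROM auctions
          WHERE ends_at <= NOW() AND email_sent = 0'
    );
    foreach ($expired as $auction) {
        // mail() stands in for whatever mailer the script actually uses.
        mail(
            $auction['winner_email'],
            'Auction ended',
            'You won auction #' . $auction['id']
        );
        // Flag the row so the next run does not resend the email.
        $stmt = $pdo->prepare('UPDATE auctions SET email_sent = 1 WHERE id = ?');
        $stmt->execute([$auction['id']]);
    }

Because the flag is set per auction, the cron interval only affects how promptly the email goes out, never whether it goes out twice.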