I am building an application in PHP that requests data from a third-party API, stores and processes that data, and then submits additional API requests based on the data received from the first request.
The issue is that the API enforces several rate limits, and when there is a large volume of data to be requested, I need to make many paginated API requests at 2-second intervals to avoid being blocked. Essentially, the program keeps looping, making an API request every 2 seconds, until there is no longer a next-page URL in the response header.
Depending on the amount of data, this can take anywhere from several minutes to several hours. I can increase the max execution time in php.ini, but this is not efficient and could still result in a timeout if, one day, the program has too much data to work with.
I'm sure there must be a better way to manage this, possibly with serverless functions or some kind of queuing system running in the background. I have never worked with serverless functions, so it will be a learning curve, but I'm happy to learn if needed.
I would love to hear what anyone thinks the best solution is. I am building the application in PHP, but I can work with JS or Node.js if I need to.
Many thanks in advance.
You can use a queue for that. There are plenty of packages, and you can choose one depending on your needs.
Also, you can use asynchronous requests, e.g. from Guzzle or another vendor (which speeds up the reading process), and you can easily implement a delay/retry middleware for the rate limiter.
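As a minimal sketch of the throttled pagination loop, assuming the API advertises the next page via a standard `Link` response header (the endpoint URL and the `$fetch`/`processPage` helpers are illustrative, not from the question):

```php
<?php
// Extract the rel="next" URL from a standard Link response header.
// Returns null when there is no next page, which ends the loop below.
function parseNextPageUrl(?string $linkHeader): ?string
{
    if ($linkHeader === null) {
        return null;
    }
    foreach (explode(',', $linkHeader) as $part) {
        if (preg_match('/<([^>]+)>;\s*rel="next"/', trim($part), $m)) {
            return $m[1];
        }
    }
    return null;
}

// Throttled pagination loop (illustrative URL; $fetch would wrap cURL):
// $url = 'https://api.example.com/items?page=1';
// while ($url !== null) {
//     [$body, $linkHeader] = $fetch($url);   // your HTTP call
//     processPage($body);                    // store/process the data
//     $url = parseNextPageUrl($linkHeader);
//     if ($url !== null) {
//         sleep(2);                          // respect the 2-second rate limit
//     }
// }
```

Run a loop like this inside a CLI queue worker rather than a web request, and PHP's web-request `max_execution_time` no longer applies.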
Is it possible to define a variable in PHP and access it from all users connected to the server?
I need a variable or object to store information in the server's RAM, without using a database or the server's file system.
Save data to the variable from one computer, and read it back from another connected computer.
What is the best practice? Is it possible at all?
Roughly - yes, it is possible.
To do that, you would need direct access to RAM, which I haven't seen done in PHP. I'm not sure whether it is possible; you can research that yourself.
What you can do instead, since PHP itself uses memory to run, is take advantage of that: write a PHP script that runs forever and acts as a server. It can use its ability to read and write memory, and that part is amazingly simple, since PHP handles memory for you automatically and you would not have to bother with addresses and such (a simple variable declaration does the job). To access this running script, you will need to look at how sockets work and how to establish a server-client connection. That is very well explained in this article.
However, I do not mean to be rude, but from the way you phrase your question I assume this may be too much for you. So what you can do instead is use Memcached, or any other in-memory caching mechanism that has already been built by people better at coding than me or you. There is plenty of information out there; just search for in-memory caching mechanisms.
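For the Memcached route, a minimal sketch (assumes a memcached daemon on localhost:11211 and the php-memcached extension; the key scheme is illustrative):

```php
<?php
// Build a namespaced cache key so different users/fields don't collide.
// (The "app:user:..." scheme is illustrative.)
function cacheKey(int $userId, string $field): string
{
    return "app:user:{$userId}:{$field}";
}

// Store and read a value in server RAM, visible to every PHP request
// on any machine that can reach the memcached server.
// (Requires the php-memcached extension and a running memcached daemon.)
// $mc = new Memcached();
// $mc->addServer('127.0.0.1', 11211);
// $mc->set(cacheKey(42, 'gold'), 100, 3600);   // written by one client...
// $gold = $mc->get(cacheKey(42, 'gold'));      // ...read by another
```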
Good luck!
I am wondering about the case when you want to make a PHP-based game that requires the player to wait for something. For example: I pay 100 gold to explore, and every 5 minutes I receive loot; the exploration ends after 30 minutes. I want to know which option is best, and why. Here are the options:
1. Keep a record of the time the exploration command was issued; then, every time that specific exploring player opens the page, calculate everything, show the result, and save it to the database.
2. Make a cron job that, every 5 minutes, calculates the exploration of EVERY player currently exploring and updates the database.
3. Make a cron job every 30 minutes to calculate and update everything for EVERY player, but also allow a SPECIFIC player to update on page load, just like option 1.
Option 3 is basically a combination of options 1 and 2. Thanks for the help. I am not sure about the performance implications, so I need to hear from people who already have experience with this.
These are just some personal opinions and might not be the best choice.
2) is more of a general approach for a multiplayer game with player interaction, but it puts constant strain on the server, which seems like overkill, as I seriously doubt your game will have complex interaction between players.
1) is probably the way to go, unless your calculation is very complex and takes a long time. The possible drawback is that you may have trouble handling lots of simultaneous update requests, but from what you describe I don't think that will happen.
3) is hard to comment on, because I have no idea whether your calculation time depends on how much time has passed since the last update. If your calculation is time-independent, then it's a horrible method: you spend time updating data that no one may need, AND you are open to traffic spikes as well.
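Option 1 (calculate on demand from the stored start time) can be sketched as a pure function; the column name and reward rule below are illustrative, not from the question:

```php
<?php
// Compute exploration loot lazily from the stored start time.
// $tickSeconds: loot interval (300 s = 5 min);
// $durationSeconds: total run length (1800 s = 30 min).
// Returns the number of loot ticks earned so far, capped at the run's end.
function lootTicks(int $startTime, int $now, int $tickSeconds = 300, int $durationSeconds = 1800): int
{
    $elapsed = min(max($now - $startTime, 0), $durationSeconds);
    return intdiv($elapsed, $tickSeconds);
}

// Usage sketch: called whenever the player opens the page; the result is
// then persisted so each tick is only awarded once.
// $ticks = lootTicks($row['exploration_started_at'], time());
```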
I would like to be able to make synchronous server requests for a game I plan on making. I've used AJAX's synchronous calls in the past sparingly because they are slow and I know that AJAX isn't cut out for this sort of task.
My reason for this is that I want it to be as hack-proof as possible. For example, if a person buys an item, the client will send a request to the server saying they wish to buy it, and the server will check whether they have enough currency and send back whether the purchase is allowed. Using this example, it would obviously be a pain if it took several seconds each time you tried to buy something.
The client side will be HTML5/JS and the server side will be PHP/SQL. What would be the best method to achieve this? Before anyone says "show me your code": I'm not asking for help fixing something broken; I'm asking for a suggestion on the best way to access a server quickly and synchronously. If it isn't possible to go faster, then such an answer would suffice.
I'd start by building the most basic approach: a simple PHP script with minimal dependencies that loads only what is required to validate the request, connect to the database, and run the query so it can return the result.
Then test its performance. If that's insufficient, I'd start looking at WebSockets or other technology stacks for just the fast-access portions (maybe Node.js?).
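A sketch of that minimal endpoint for the buy-item case (the PDO handle, table, and column names are illustrative; the point is that the server, never the client, decides whether the purchase is allowed):

```php
<?php
// Server-side purchase check: never trust a balance the client claims.
function canAfford(int $balance, int $price): bool
{
    return $price >= 0 && $balance >= $price;
}

// Minimal endpoint sketch. The "AND gold >= :p2" condition makes the
// UPDATE atomic, so two simultaneous requests can't double-spend:
// $stmt = $pdo->prepare('SELECT gold FROM players WHERE id = ?');
// $stmt->execute([$playerId]);
// $balance = (int) $stmt->fetchColumn();
// if (canAfford($balance, $price)) {
//     $ok = $pdo->prepare(
//         'UPDATE players SET gold = gold - :p WHERE id = :id AND gold >= :p2'
//     )->execute(['p' => $price, 'id' => $playerId, 'p2' => $price]);
//     echo json_encode(['allowed' => $ok]);
// }
```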
Making the request run synchronously doesn't make it faster. It just means that the browser is going to become unresponsive until the request is complete. Making it fast is a matter of writing your server-side code such that it can run quickly, and without any details we can't tell you how to do that.
I'm writing my first AJAX driven website and some parts of it are kind of slow. For example, I'll click a button to change the page, and it might take three seconds for the new page to appear.
Chrome developer tools shows the network activity for the page being changed as follows:
DNS Lookup: 1 ms
Connecting: 45 ms
SSL: 21 ms
Sending: 0 ms
Waiting: 1.89 s
Receiving: 73 ms
The size of the above request was 49.1 KB.
Clearly the "Waiting" time is where the slowdown occurs. My question is: what causes this "Waiting" time? Is it something to do with the jQuery AJAX request, is the MySQL database slow, or is something in a PHP file causing a delay?
Without seeing my project and debugging it first-hand, you may not be able to answer that question. If that's the case, how should I go about determining which part of the application is slowing things down?
That depends on your debugging tools. At the most basic level, comment out parts of your server-side code and check how much the "Waiting" time drops.
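A slightly less destructive variant of that is to time each stage of the request in PHP and log the numbers (the stage names here are illustrative):

```php
<?php
// Time a named stage of a request to find where the "Waiting" time goes.
// Returns the elapsed wall-clock time in seconds.
function timeStage(callable $stage): float
{
    $start = microtime(true);
    $stage();
    return microtime(true) - $start;
}

// Usage sketch: wrap each candidate (DB query, processing, rendering)
// and log the numbers, e.g. with error_log():
// $dbSeconds = timeStage(fn () => $stmt->execute());
// error_log(sprintf('db=%.3fs', $dbSeconds));

$elapsed = timeStage(fn () => usleep(50000)); // ~0.05 s, for demonstration
```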
I don't know anything about profiling MySQL/PHP applications (in Django, you could use django-debug-toolbar), but Ajax queries are good candidates to cache in both DB and app output layers.
Consider using a cache system like memcached.
I have a somewhat special question: I need to know the technologies behind a monitoring website like pingdom.com.
What is the best language to use to develop such a platform? What libraries can we use? What about a distributed solution?
Thank you in advance.
I think this is more of an opinion question (which is why it has received negative votes). This type of service could be developed in many languages. It's based on a few key principles:
- Can the site be reached? Yes/no.
- What determined that the site could not be reached? Did it time out?
- What is the max timeout we will allow before we decide it's unreachable?
- How does the site fare from other sources worldwide? We'll need to run the same code from multiple servers and know where they are. This could be done with something like Amazon, which lets you set up servers all over the world.
- How does this fare compared to the last time we pinged it? (We'll need to store the result in a database for reference and feedback to the user.)
You could easily build this sort of service in something like PHP, but like I said, it could be made in many languages, which is why this is opinion-based: there's no real clear answer.
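The core check itself is small. A sketch using PHP's cURL extension (the URL and the 10-second timeout are illustrative choices):

```php
<?php
// Decide up/down from the HTTP status code (0 means no response at all).
function isUp(int $httpCode): bool
{
    return $httpCode >= 200 && $httpCode < 400;
}

// Single probe: returns [reachable, httpCode, seconds] for one URL.
// The timeout is the "max we allow before deciding it's unreachable".
function probe(string $url, int $timeoutSeconds = 10): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,   // HEAD-style check, no body download
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => $timeoutSeconds,
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    $start = microtime(true);
    curl_exec($ch);
    $code = (int) curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return [isUp($code), $code, microtime(true) - $start];
}

// Each worldwide worker would run probe() on a schedule and store
// [timestamp, region, code, seconds] in the database for comparison.
```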
The biggest problems a site like pingdom.com has are:
- Number of requests (they use a queuing system to stop their servers being overloaded)
- They make money by monitoring sites all the time, so they need to make sure the value-add revenue generated outweighs the cost of free users (the leads they generate), which it clearly does, as their plans are so cheap.
Hope this helps.