Fast synchronous server requests with JS [closed]

I would like to be able to make synchronous server requests for a game I plan on making. I've used synchronous AJAX calls sparingly in the past because they are slow, and I know that AJAX isn't really cut out for this sort of task.
My reason is that I want the game to be as hack-proof as possible. For example, when a player buys an item, the client sends a request to the server saying they wish to buy it; the server checks whether they have enough currency and sends back whether the purchase is allowed. With this flow, it would obviously be a pain if every purchase took several seconds.
The client side would be HTML5/JS and the server side would be PHP/SQL. What would be the best method to achieve this? Before anyone says "show me your code": I'm not asking for help fixing something broken. I'm asking for a suggestion on the best way to access a server quickly and synchronously. If it can't be done any faster, then that answer would suffice.

I'd start by building the most basic approach: a simple PHP script with minimal dependencies that loads only what it needs to validate the request, connect to the database, and run the query so it can return the result.
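For illustration, a minimal sketch of what that script might look like, assuming PDO/MySQL and PHP 7+; the `players`/`items` schema and the request fields are invented placeholders, not a known design:

```php
<?php
// Minimal sketch of a buy-item endpoint (PDO/MySQL, PHP 7+).
// The `players`/`items` schema and the request fields are invented
// placeholders, not a known design.

session_start();

$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$playerId = (int) ($_SESSION['player_id'] ?? 0);
$itemId   = (int) ($_POST['item_id'] ?? 0);

// Look up the item's price server-side; never trust a price from the client.
$stmt = $pdo->prepare('SELECT price FROM items WHERE id = ?');
$stmt->execute([$itemId]);
$cost = $stmt->fetchColumn();

if ($cost === false) {
    http_response_code(404);
    exit;
}

// Check-and-debit in one statement: the `currency >= ?` condition makes the
// operation atomic, so two rapid-fire requests cannot overspend.
$buy = $pdo->prepare('UPDATE players SET currency = currency - ? WHERE id = ? AND currency >= ?');
$buy->execute([$cost, $playerId, $cost]);

header('Content-Type: application/json');
echo json_encode(['ok' => $buy->rowCount() === 1]);
```

The atomic UPDATE is what provides the hack-proofing the question asks about: the client only ever learns whether the purchase went through, and all validation stays server-side.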
Then test its performance. If that's insufficient, I'd start looking at websockets or other technology stacks for just the fast-access portions (maybe node.js?).

Making the request run synchronously doesn't make it faster. It just means that the browser is going to become unresponsive until the request is complete. Making it fast is a matter of writing your server-side code such that it can run quickly, and without any details we can't tell you how to do that.

Related

How to best optimise a PHP program so it doesn't time out? [closed]

I am building an application in PHP that requests data from a third-party API, stores and processes that data, and then submits additional API requests based on the data received from the first request.
The issue is that there are several rate limits, and where there is a large volume of data to be requested, I need to make many paginated API requests at 2-second intervals to avoid being blocked. Essentially, the program keeps looping, making API requests every 2 seconds, until there is no longer a next-page URL in the response header.
Depending on the amount of data, this could take anywhere from several minutes up to several hours. I can increase the max execution time in php.ini, but this is not efficient and could still result in a timeout if one day the program has too much data to work with.
I'm sure there must be a better way to manage this, possibly with serverless functions or some kind of queuing system running in the background. I have never worked with serverless functions, so it will be a learning curve, but I'm happy to learn if needed.
I would love to hear what anyone thinks the best solution is. I am building the application in PHP, but I can work with JS or Node.js if I need to.
Many thanks in advance.
You can use a queue for that. There are plenty of packages, and you can choose one depending on your needs.
Also, you can use asynchronous requests, e.g. from Guzzle or some other vendor (which speeds up the reading), and you can easily implement a delayed-retry middleware for the rate limiter.
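For illustration, a minimal sketch of such a retry middleware plus the pagination loop, assuming Guzzle 7; the endpoint URL and the `X-Next-Page` header are hypothetical stand-ins for whatever the third-party API actually returns:

```php
<?php
// Minimal sketch, assuming Guzzle 7 (composer require guzzlehttp/guzzle).
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Middleware;
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;

$stack = HandlerStack::create();

// Retry up to 5 times when the API answers 429 (rate limited)...
$decider = function ($retries, RequestInterface $request, ResponseInterface $response = null) {
    return $retries < 5 && $response !== null && $response->getStatusCode() === 429;
};

// ...waiting 2 seconds before each retry (the delay callback returns milliseconds).
$delay = function ($retries) {
    return 2000;
};

$stack->push(Middleware::retry($decider, $delay));
$client = new Client(['handler' => $stack]);

// Follow the next-page URL from the response header until it runs out.
// The header name here is hypothetical; use whatever the API sends.
$url = 'https://api.example.com/items?page=1';
while ($url) {
    $response = $client->get($url);
    $data = json_decode((string) $response->getBody(), true);
    // ... store/process $data here ...

    $url = $response->getHeaderLine('X-Next-Page') ?: null;
    sleep(2); // keep the 2-second spacing between paginated requests
}
```

Moving this loop out of the web request and into a queued job or a CLI worker is what removes the timeout problem entirely; the retry middleware just keeps the worker polite about rate limits.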

PHP and SQL basic instant messaging [closed]

I am thinking of creating a simple web-based instant messenger using a combination of PHP and SQL. To keep it simple, I was thinking of not pushing the message to the other client's browser using Comet or AJAX, but simply uploading it to a SQL database. The other client's computer would then periodically refresh the webpage, which would cause the PHP code on the server to check for and return any new messages.
Would this method simply be too slow to be actually useful?
Thanks in advance :)
That depends on the scope of your project. If you're thinking of serving a thousand users, this is not a recommended method. If you want to chat with your 5 colleagues on an internal LAN, it doesn't really matter much. It will be fast and work just fine.
You could also consider building it with jQuery + PHP + SQL, though; read up on jQuery a bit and you'll be amazed by the power of its AJAX functions.
Also, if you're lazy or simply don't have enough time, use a premade chat library; I'm sure there are many to be found on the internet.
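For a sense of how little server code the polling approach needs, here is a minimal sketch of the PHP side, assuming a hypothetical `messages` table (id, sender, body, created_at); the client (e.g. jQuery's `$.getJSON` on a timer) would pass the last message id it has already seen:

```php
<?php
// Minimal sketch of a polling endpoint. The `messages` table and the
// connection details are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$lastId = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;

// Return only messages newer than what the client already has.
$stmt = $pdo->prepare('SELECT id, sender, body, created_at FROM messages WHERE id > ? ORDER BY id');
$stmt->execute([$lastId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```

Polling an endpoint like this every few seconds is much lighter than reloading the whole page, since only new rows cross the wire.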

How to determine what part of an AJAX application is slowing things down? [closed]

I'm writing my first AJAX driven website and some parts of it are kind of slow. For example, I'll click a button to change the page, and it might take three seconds for the new page to appear.
Chrome developer tools shows the network activity for the page being changed as follows:
DNS Lookup: 1 ms
Connecting: 45 ms
SSL: 21 ms
Sending: 0
Waiting: 1.89 s
Receiving: 73 ms
The size of the above request was 49.1 KB.
Clearly the "Waiting" time is where the slowdown is occurring. My question is, what is causing this "waiting" time? Is it something to do with the jQuery AJAX request, or is it because the MySQL database is slow, or something in a PHP file causing a delay?
Without seeing my project and debugging it first-hand, you may not be able to answer that question. If that's the case, how should I go about determining which part of the application is slowing things down?
That depends on your debug tools. At the most basic level, comment out parts of your server-side code and check how much the "waiting" time drops.
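A slightly less destructive variant of the same idea is to time the suspect sections with `microtime()` and log the results. A minimal sketch, with the section names and the query standing in for your own code:

```php
<?php
// Minimal sketch of poor-man's profiling with microtime(): wrap the suspect
// sections of the endpoint and log how long each takes. The section names
// and the query are placeholders for your own code.

$timings = [];

$start = microtime(true);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');
$rows = $pdo->query('SELECT * FROM some_table')->fetchAll();
$timings['query'] = microtime(true) - $start;

$start = microtime(true);
// ... whatever processing the endpoint does ...
$timings['processing'] = microtime(true) - $start;

// Log server-side so the numbers show up without changing the response.
error_log(json_encode($timings));
```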
I don't know anything about profiling MySQL/PHP applications (in Django, you could use django-debug-toolbar), but AJAX queries are good candidates for caching at both the database and application output layers.
Consider using a caching system like Memcached.
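For illustration, a minimal sketch of that kind of caching with PHP's Memcached extension, assuming a Memcached server on localhost:11211; the query and cache key are placeholders:

```php
<?php
// Minimal sketch of output caching with the Memcached extension.
$cache = new Memcached();
$cache->addServer('localhost', 11211);

$key  = 'page_data_v1';
$data = $cache->get($key);

if ($data === false) {
    // Cache miss: hit the database and keep the result for 60 seconds.
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');
    $data = $pdo->query('SELECT * FROM some_table')->fetchAll(PDO::FETCH_ASSOC);
    $cache->set($key, $data, 60);
}

header('Content-Type: application/json');
echo json_encode($data);
```

If the "waiting" time collapses once the result is cached, you know the database query was the bottleneck rather than PHP or the network.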

Considering usage of server-side JavaScript [closed]

Recently, I have heard a lot about JavaScript being used server-side. Node.js, express.js and similar tools are mentioned more and more, but I never had the time or inclination to dig deeper into the topic.
Now that the flood of information shows no sign of decreasing, I can no longer ignore it.
I wonder about two things:
Can I replace my complete, simple PHP backend, which is primarily used only for database access, with server-side JavaScript?
And if yes:
Are there any benefits of doing so?
And if not:
Why is there such a hype?
The topic seems to be quite complex and not too easy to learn, but as time goes by, I more and more get the feeling that this may be the future of backend coding.
Yes, you can.
If you are primarily serving data, a more contemporary approach would be to use Node.js to implement a RESTful API. Node.js is particularly suited to this, as it inherently works asynchronously: each request to the data source (i.e. the database) does not block while the server is waiting for it to return, allowing it to punch well above its weight when servicing many requests.
You could use the Node modules express.js or restify.js to implement this.
A warning, though: Node.js is a single-threaded application, which means some work has to be carried out before it is scalable. There are some good solutions for this, such as using Amazon Elastic Beanstalk, but as Node.js is a relative newcomer, many other proposed solutions may need some coaxing to be production-ready.
You may find it beneficial to read "JavaScript: The Good Parts" by Douglas Crockford before you begin; it was something I needed to bring my knowledge of JavaScript to a level where I could write quality, maintainable code for Node.js.
Yes, you can replace it.
The main concepts you have to know about Node are, first, that it is asynchronous and, second, that it is event-driven.
So if your PHP app just accesses the database and serves responses back, Node.js would be more efficient in such an application, as it does not block while idling for a response from the database but can process other requests in the meantime.
It is not complicated if you just do it. Dive in. Don't ask; prototype. It is the best way to understand whether you really need it or not.
I've replaced everything I needed PHP for with Node.js, except templating.

PHP 5 - decoupling important scripts using cURL/socket calls, good idea? [closed]

I have an IPN script for PayPal, and as I can only rely on a log file to debug it and it's hard to maintain a testing server (and even to test, as I am a one-man team), I am considering using cURL to run other scripts that would handle sending emails, logging to the database, and updating logs.
This way, if I have to send a new email (for some reason, clients never make up their minds), I don't have to tangle with the IPN script. Not just that: if for some reason those new additions cause a FATAL error, the original IPN still runs. Is this a good idea?
Why chew up valuable Apache processes (assuming Apache/mod_php)?
If you want to do things asynchronously, cURL won't help, since it doesn't operate like that. You might look at kicking off some external scripts via the command line if you want to do some things in a fire-and-forget sort of way.
Otherwise, what's wrong with just abstracting this peripheral (to IPN handling) activity like you would any other abstraction? Wrapping it in a function being the most obvious thing.
Then if you need to add some new feature (sending a new kind of email, for example), you just write a function that sends that email, test it until it works, then add a single line to your IPN-handling script.
Maybe I'm missing something?
Responding to your edit: Obviously, you should avoid fatal errors at all costs. But you should be able to avoid them without resorting to these kinds of heroics. For example, if you need to send an email, then write a function to send it based on some parameters. Ensure that function won't create fatal errors by writing enough test coverage.
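For illustration, a minimal sketch of that abstraction; the function name and parameters are invented for the example, and the peripheral step is guarded so a failure there cannot break the IPN response (catching `Exception`, given the PHP 5 context):

```php
<?php
// Hypothetical helper: the email step lives in its own tested function.
function sendPurchaseEmail($to, $subject, $body)
{
    // mail() is the simplest built-in; swap in your mailer of choice.
    return mail($to, $subject, $body);
}

// Inside the IPN handler: one guarded line per peripheral task.
try {
    if (!sendPurchaseEmail('buyer@example.com', 'Thanks for your order', 'Order received.')) {
        error_log('IPN: purchase email could not be sent');
    }
} catch (Exception $e) {
    // Log and carry on; the IPN acknowledgement to PayPal must still complete.
    error_log('IPN: email step failed: ' . $e->getMessage());
}
```

This gets you the decoupling you're after (a broken email step can't kill the IPN handler) without spending a second Apache process on a cURL round trip back to yourself.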
