I am very new to PHP. I am trying to create a PHP script that handles multiple JSON-encoded GET requests arriving simultaneously from a client's software over a single TCP connection.
While reading around, I came across the "HTTP pipelining, processing requests in parallel" question on Stack Overflow. I want to process the requests as and when they arrive, but because the requests are pipelined by design, they are processed one by one.
The problem is that if the client software makes 100 requests to the PHP script a few milliseconds apart, the script takes some time to process each one, which adds an immense amount of delay before the last request is processed and its response sent back to the requesting entity.
I am using $_GET to retrieve the requests. I have searched for information on this and can't find anything substantial. Could anyone kindly point me in the right direction?
Thank you in advance.
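For concreteness, a stripped-down version of the kind of receiving script I mean (the "payload" parameter name is made up; the client sends something equivalent):

    <?php
    // Minimal sketch of the receiving script. The "payload" parameter
    // name is an invention; substitute whatever the client really sends.
    header('Content-Type: application/json');

    $raw  = isset($_GET['payload']) ? $_GET['payload'] : '';
    $data = json_decode($raw, true);

    if ($data === null) {
        http_response_code(400);
        echo json_encode(array('error' => 'invalid or missing JSON'));
        exit;
    }

    // ... handle the decoded request here ...
    echo json_encode(array('status' => 'ok'));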
If you are using a web server like Apache, this is handled for you in exactly the manner you are describing.
I'm really struggling to understand how asynchronous requests work in PHP. It seems to me that there cannot be any truly asynchronous behaviour in PHP, because to receive the response of an asynchronous request you have to delay execution of the script (blocking, essentially), which defeats the purpose... right?
What I'm trying to do from a page on my site is:
Initiate an AJAX request to a (local) controller to begin the remote API requests
Loop over GuzzleHTTP requests to the remote API
Upon completion of each request, send the result to a socket.io server which then emits a message back to the listening client (the original page that initiated the AJAX request)
Now I have this working, with one caveat: the GuzzleHTTP requests are not asynchronous, and the remote API is very slow (it usually takes around 10 seconds to return the JSON response), so when there are 10 tasks my connection to the server is frozen for over a minute. Opening new tabs or windows and trying to access the site just waits until the original script run has completed.
This seems to indicate that I have misunderstood how HTTP requests work. I thought each individual browser window (or request) was completely separate as far as the server is concerned, but from what I'm seeing, perhaps that is not exactly the case?
This is why I started looking into asynchronous requests and discovered that they don't really seem all that... asynchronous.
So what I'm really looking for is help filling the gaps/misunderstandings in my knowledge of:
Why the blocking occurs in the first place (in terms of new, completely separate requests)
Where I might look to implement a solution.
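On the first point: if the controller holds the PHP session open while it loops, every other request sharing that session (new tabs and windows included) queues behind the session file lock, which matches the freezing described; calling session_write_close() before the slow work starts releases the lock. On the second point: Guzzle can send requests concurrently through its promise API, so the script blocks for roughly one slow request's worth of time rather than the sum of all of them. A minimal sketch, assuming Guzzle 7 (the base URI and task list are placeholders):

    <?php
    require 'vendor/autoload.php';

    use GuzzleHttp\Client;
    use GuzzleHttp\Promise\Utils;

    $client  = new Client(['base_uri' => 'https://api.example.com/']); // placeholder URI
    $taskIds = range(1, 10); // placeholder task list

    // Queue all requests; nothing is sent until we wait on the promises.
    $promises = [];
    foreach ($taskIds as $id) {
        $promises[$id] = $client->getAsync('/tasks/' . $id);
    }

    // Send them concurrently and wait for every one to settle.
    $results = Utils::settle($promises)->wait();

    foreach ($results as $id => $result) {
        if ($result['state'] === 'fulfilled') {
            $json = json_decode($result['value']->getBody(), true);
            // ... forward $json to the socket.io server here ...
        }
    }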
I have an MS SQL Server that handles queries sent by a system written in PHP, everything running under Windows. The problem I have right now is that if a query takes a long time to process, all the remaining incoming requests made by other users are not processed until PHP gets the results of the first query and finishes the first request.
Is there a way to make PHP handle many requests in parallel? Right now I have a very big bottleneck: the SQL server can handle a lot of simultaneous queries, but the web application can only send them request by request.
If necessary I can use Linux-based solutions too.
Finally I solved my problem using the function session_write_close().
Since the session file was the same for all the requests and PHP was locking it, all the requests got stuck.
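In practice that just means closing the session before the slow work starts, along these lines (a sketch; the DSN and query are placeholders):

    <?php
    session_start();

    // Read whatever session data this request needs first.
    $userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

    // Release the session file lock so other requests from the same
    // user are no longer serialized behind this one.
    session_write_close();

    // The long-running query can now proceed without blocking them.
    $pdo  = new PDO('sqlsrv:Server=localhost;Database=app', 'user', 'pass'); // placeholder DSN
    $stmt = $pdo->query('EXEC dbo.SlowReport'); // placeholder query
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    // Caveat: writes to $_SESSION after session_write_close() are not saved.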
I'm currently working on an event-logging system that will form part of a real-time analytics system. Individual events are sent via RPC from the main application to another server, where a separate PHP script running under Apache handles the event data.
Currently the receiving server's PHP script hands the event data off to an AMQP exchange/queue, from which a Java application pops events, batches them up and performs a batch DB insert.
This will provide great scalability, but I'm thinking the cost is complexity.
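For context, the PHP side of that hand-off is the thin part; with php-amqplib it is roughly this shape (simplified; connection details and the exchange name are placeholders):

    <?php
    require 'vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    // Placeholder connection details and exchange name.
    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel    = $connection->channel();

    $event = array('type' => 'page_view'); // placeholder event data

    // Fire-and-forget: hand the event to the broker and return immediately.
    $channel->basic_publish(new AMQPMessage(json_encode($event)), 'events');

    $channel->close();
    $connection->close();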
I'm now looking to simplify things a little, so my questions are:
Would it be possible to remove the AMQP queue and perform the batching and inserting of events directly into the DB from within the PHP script(s) on the receiving server?
And if so, would some kind of intermediary data store be required to batch up the events, or could the batching be done within PHP itself?
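To illustrate what I mean by batching within PHP, something along these lines (table and column names are made up, and of course a single request can only batch the events it received itself):

    <?php
    // Batch N events into one multi-row INSERT via PDO.
    function insertBatch(PDO $pdo, array $events)
    {
        if (empty($events)) {
            return;
        }

        $placeholders = implode(', ', array_fill(0, count($events), '(?, ?, ?)'));
        $sql = 'INSERT INTO events (type, payload, created_at) VALUES ' . $placeholders;

        $params = array();
        foreach ($events as $e) {
            $params[] = $e['type'];
            $params[] = json_encode($e['payload']);
            $params[] = $e['created_at'];
        }

        $pdo->prepare($sql)->execute($params);
    }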
Thanks in advance
Edit:
Thanks for taking the time to respond. To be more specific: is it possible for a PHP script running under Apache to be configured to handle multiple HTTP requests?
That is, as Apache spawns child processes, could each of those processes be configured to accept, say, 1000 HTTP requests, deal with them and then shut down?
I see three potential answers to your question:
Yes
No
Probably
If you share metrics from alternative implementations (everything you ask about is technically possible, so try it first and gather hard results), we can give better suggestions. But as long as you don't put some meat on the grill and show us the results, there is not much more to tell.
I've got a small PHP web app I put together to automate some manual processes that were tedious and time-consuming. The app is pretty much a GUI that SSHes out and "installs" software to target machines based on atomic change numbers from source control (Perforce, if it matters). The app currently kicks off each installation in a new popup window, so if I'm installing software on 10 different machines, I get 10 different popups. This is getting to be too much. What are my options for kicking these processes off and displaying the results back on one page?
I was thinking I could have one popup that dynamically creates a div for every installation I kick off, makes an AJAX call for each one, and then displays the output of each install in the corresponding div. The only problem is that I don't know how to kick these processes off in parallel; it will take far too long if I have to wait for each one to go out, do its thing, and spit the results back. I'm using jQuery if it helps, but I'm mainly looking for high-level architecture ideas at the moment. Code examples are welcome, but pseudocode is just fine.
I don't know how advanced you are, or even whether you have root access to your server (which would be required), but this is one possible way. It uses several different technologies and is probably better suited to a large-scale application than a small one, but I'll describe it anyway.
The following technologies/stacks are used (in addition to PHP, as you mentioned):
WebSockets (on top of node.js)
JSON-RPC Server (within node.js)
Gearman
What you would do is this: from your client (so via JavaScript), when the page loads, a connection is made to node.js via WebSockets (you can use something like socket.io for this).
Then, when you decide you want to run a task (which might take a long time...), you send a request to your server. This might be a JSON-encoded raw body, or it might just be a simple GET /do/something. What is important is what happens next.
On your server, when the job is received, you hand it off to Gearman by adding a task to the job server. Submitting the task is non-blocking, so you can respond immediately to the client who made the request, saying "hey, we are processing your job".
Then one of your Gearman workers receives the job and starts processing it. Let's say, for argument's sake, this takes 5 minutes. Once it has finished, the worker builds a JSON-encoded message and sends it to your node.js server, which receives it via JSON-RPC.
Once node.js gets the message, it can emit the event over WebSockets to any connections that need to know about it.
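To make the Gearman hand-off concrete, the two PHP sides look roughly like this (a sketch assuming the pecl gearman extension; the "install" function name and workload are made up):

    <?php
    // Submitting side: the PHP script that receives the HTTP request.
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);

    // doBackground() returns immediately with a job handle, so the HTTP
    // response can go straight back to the browser.
    $handle = $client->doBackground('install', json_encode(array('target' => 'host1')));

    <?php
    // Worker side: a separate, long-running CLI process.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('install', function (GearmanJob $job) {
        $params = json_decode($job->workload(), true);
        // ... do the slow work, then notify node.js via its JSON-RPC endpoint ...
    });
    while ($worker->work());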
I needed something like this for a project once and managed to learn the basics of node.js in a day (already having a strong JS background). On the second day I had a complete push/pull messaging job-notification platform.
I've finally made a simple chat page that I had wanted to make for a while now, but I'm running into problems with my servers.
I'm not sure if long polling is the correct term, but from what I understand, I think it is. I make an AJAX call to a PHP page that checks a MySQL database for messages newer than the time sent in the AJAX request. If there isn't a newer message, it keeps looping and checking until there is; otherwise it returns the new messages, and the client script sends another AJAX request as soon as it receives them.
Everything works fine, except that the server on 000webhost stops responding after a few chat messages, and the server on x10 hosting gives me a message about hitting a resource limit.
Maybe this is a dumb way to do a chat system, but it's all I know how to do. If there is a better way please let me know.
Edit: Holy hell, it's just occurred to me that I didn't put any sleep time in the while loop on the server.
You can find a lot of reading on this, but I doubt that free web hosting is going to let you do what you have in mind. PHP was also not really designed for building chat systems.
I would recommend WebSockets, using, for example, Node.js with Socket.IO, or Tornado with Python. There are a lot of solutions out there, but most of them require you to run your own server, since they involve running a whole program that handles many connections at once, rather than simple scripts that start and finish with a single connection.
What about using the same strategy whether or not there are newer messages on the server? The server would always return a list of newer messages; the list is simply empty when there are none. (The empty list could also be encoded as a special data token.)
The client then proceeds the same way in both cases: it processes the received data and requests new messages again after a time period.
Make sure you sleep(1) on each pass through the loop; otherwise the code will run the loop many times per second, stressing your database and server.
But still, Node.js or WebSockets are better technologies for real-time chat.
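A sketch combining both suggestions above (the DSN and table/column names are placeholders):

    <?php
    // Wait up to 30 seconds for messages newer than the client's timestamp,
    // sleeping between checks instead of hammering the database.
    session_write_close(); // don't hold the session lock while waiting

    $since = isset($_GET['since']) ? (int) $_GET['since'] : 0;
    $pdo   = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass'); // placeholder DSN

    $messages = array();
    $deadline = time() + 30;

    do {
        $stmt = $pdo->prepare('SELECT user, body, created_at FROM messages WHERE created_at > ?');
        $stmt->execute(array($since));
        $messages = $stmt->fetchAll(PDO::FETCH_ASSOC);

        if ($messages) {
            break;
        }
        sleep(1); // one check per second, not thousands
    } while (time() < $deadline);

    // Always return a list, possibly empty, and let the client poll again.
    header('Content-Type: application/json');
    echo json_encode($messages);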