Apache, PHP multi-threading on Windows for concurrent users - php

I have an MS SQL Server that handles queries sent by a system written in PHP, everything running on Windows. The problem I have right now is that if a query takes a long time to process, all the remaining incoming requests made by other users won't be processed until PHP gets the results from the first one and finishes that first request.
Is there a way to allow/make PHP handle many requests in parallel, simultaneously? Right now I have a very big bottleneck, since the SQL server can handle a lot of simultaneous queries but the web application can only send the block of queries one request at a time.
If necessary, I can use Linux-based solutions too.

Finally, I solved my problem using the function session_write_close().
Since the session file was the same for all the requests and PHP was locking it, all the requests got stuck.
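For illustration, a minimal sketch of that fix, assuming the sqlsrv driver for MS SQL Server; the connection details, stored procedure, and session key are placeholders:

    <?php
    session_start();
    $userId = $_SESSION['user_id'] ?? null;   // read whatever the script needs first

    // Release the session lock so parallel requests are no longer blocked.
    // Note: writes to $_SESSION after this point will not be persisted.
    session_write_close();

    // The slow query can now run without holding up other requests.
    $conn = sqlsrv_connect('SERVER\\INSTANCE', ['Database' => 'mydb']);
    $stmt = sqlsrv_query($conn, 'EXEC dbo.LongRunningReport @UserId = ?', [$userId]);
    while ($row = sqlsrv_fetch_array($stmt, SQLSRV_FETCH_ASSOC)) {
        // ... build the response ...
    }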

Related

Optimize PHP+CURL connection times

I have PHP scripts doing CURL POST requests to distant Nginx servers via HTTPS (several times per second).
My issue is that each request needs 3 round-trips (TCP connection + SSL handshake) before the transfer can start, which significantly slows down the process.
Is there a way to reduce this, for instance with some sort of "Keep-Alive", to avoid renegotiating TCP/SSL for each request?
Thank you!
There is no way to keep a connection alive between two different PHP executions, as the PHP script "dies" at the end (thus closing any open socket). The only way to do what you want to achieve would be to have a background PHP script that never stops, takes care of fetching the data, and puts it into a database or a file that you will be able to easily and rapidly query later.
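A rough sketch of that idea, with a placeholder URL and output path: a long-running CLI worker that reuses a single cURL handle (so the TCP/TLS connection can be kept alive between fetches) and caches the responses to a file for other scripts to read:

    <?php
    // Run from the command line, e.g. `php worker.php`, and keep it running.
    $ch = curl_init('https://api.example.com/data');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TCP_KEEPALIVE  => 1,   // keep the underlying TCP connection alive
    ]);

    while (true) {
        $body = curl_exec($ch);        // reusing the handle avoids a new TCP/SSL handshake
        if ($body !== false) {
            file_put_contents('/tmp/latest.json', $body, LOCK_EX);
        }
        usleep(200000);                // roughly 5 fetches per second
    }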
On another topic, making multiple HTTPS requests per second is maybe not the most efficient way to do it. If you have control over the server you query, you might want to use WebSockets, which would allow you to make multiple queries per second without any major performance issue.
I hope this answered your question, have a good day.

How to process multiple parallel requests from one client to one PHP script

I have a webpage that, when users go to it, instantly makes multiple (10-20) Ajax requests to a single PHP script which, depending on the parameters in the request, returns a different report with highly aggregated data.
The problem is that a lot of the reports require heavy SQL calls to get the necessary data, and in some cases, a report can take several seconds to load.
As a result, because one client is sending multiple requests to the same PHP script, you end up seeing the reports slowly load on the page one at a time. In other words, the generating of the reports is not done in parallel, and thus causes the page to take a while to fully load.
Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?
Thank you.
As far as I know, it is possible to do multi-threading in PHP.
Have a look at pthreads extension.
What you could do is have the report-generation part/function of the script executed in parallel. This will make sure that each function is executed in a thread of its own and will retrieve your results much sooner. Also, set the maximum number of concurrent threads to 10 or fewer so that it doesn't become a resource hog.
Here is a basic tutorial to get you started with pthreads.
And a few more examples which could be of help (notably the SQLWorker example in your case).
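A minimal, hedged sketch of that approach, assuming a thread-safe (ZTS) PHP build with the pthreads extension installed; the class name, connection details, and build_report procedure are placeholders:

    <?php
    class ReportTask extends Thread
    {
        public $reportId;
        public $resultJson;

        public function __construct($reportId)
        {
            $this->reportId = $reportId;
        }

        public function run()
        {
            // Each thread needs its own database connection; never share one.
            $db  = new mysqli('127.0.0.1', 'user', 'pass', 'analytics');
            $res = $db->query('CALL build_report(' . (int) $this->reportId . ')');
            $this->resultJson = $res ? json_encode($res->fetch_all(MYSQLI_ASSOC)) : null;
        }
    }

    $tasks = [];
    foreach (range(1, 10) as $id) {        // capped at 10 concurrent threads
        $tasks[$id] = new ReportTask($id);
        $tasks[$id]->start();
    }
    foreach ($tasks as $task) {
        $task->join();                     // wait for every report to finish
    }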
Server setup
This is more of a server-configuration issue and depends on how PHP is installed on your system: if you use php-fpm, you have to increase the pm.max_children option. If you use PHP via (F)CGI, you have to configure the webserver itself to use more children.
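For illustration, a php-fpm pool snippet (e.g. in www.conf) showing where that option lives; the numbers are examples only and should be sized to the available memory:

    pm = dynamic
    pm.max_children = 50
    pm.start_servers = 10
    pm.min_spare_servers = 5
    pm.max_spare_servers = 20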
Database
You also have to make sure that your database server allows that many concurrent processes to run. It won’t do any good if you have enough PHP processes running but half of them have to wait for the database to notice them.
In MySQL, for example, the setting for that is max_connections.
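For illustration, the corresponding my.cnf entry; the value is an example only and should be at least as high as the number of PHP workers:

    [mysqld]
    max_connections = 200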
Browser limitations
Another problem you’re facing is that browsers won’t do 10-20 parallel requests to the same hosts. It depends on the browser, but to my knowledge modern browsers will only open 2-6 connections to the same host (domain) simultaneously. So any more requests will just get queued, regardless of server configuration.
Alternatives
If you use MySQL, you could try to merge all your calls into one request and run the SQL queries in parallel using mysqli::poll().
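A rough sketch of that async approach, assuming the mysqlnd driver; the credentials and queries are placeholders (note that each parallel query needs its own connection):

    <?php
    $queries = [
        'SELECT COUNT(*) AS n FROM orders',
        'SELECT COUNT(*) AS n FROM customers',
    ];

    $links = [];
    foreach ($queries as $i => $sql) {
        $links[$i] = new mysqli('127.0.0.1', 'user', 'pass', 'reports');
        $links[$i]->query($sql, MYSQLI_ASYNC);   // returns immediately
    }

    $results = [];
    while ($links) {
        $read = $error = $reject = $links;
        if (mysqli::poll($read, $error, $reject, 1) < 1) {
            continue;                            // nothing ready yet, keep waiting
        }
        foreach ($read as $link) {
            if ($res = $link->reap_async_query()) {
                $results[] = $res->fetch_assoc();
                $res->free();
            }
            // Drop the finished connection so it is not polled again.
            foreach ($links as $key => $candidate) {
                if ($candidate === $link) {
                    unset($links[$key]);
                }
            }
        }
    }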
If that’s not possible you could try calling child processes or forking within your PHP script.
Of course PHP can execute multiple requests in parallel, if it runs behind a web server like Apache or Nginx. PHP's built-in dev server is single-threaded, but it should only be used for development anyway. If you are using PHP's file-based sessions, however, access to the session is serialized, i.e. only one script can have the session file open at any time. Solution: fetch the information you need from the session at script start, then close the session.

Handling multiple HTTP GET requests on same TCP connection parallely

I am very new to learning PHP. I am trying to create a PHP script that would handle multiple GET requests, JSON-encoded, coming from a client's software over a single TCP connection to the PHP script simultaneously.
Whilst reading through, I encountered the "HTTP pipe-lining, processing requests in parallel" article on Stack Overflow. I would want to process the requests as and when they arrive; by design, pipelined requests are processed one by one.
The problem here being that if the client software makes 100 requests to the PHP script within a few milliseconds of each other, my PHP script would take some time to process each request and eventually add an immense amount of time before the last request is processed and sent back to the requesting entity.
I am using $_GET for retrieving the requests. I have looked for this information and don't seem to find anything substantial. I would appreciate any help on this; could anyone kindly guide me in the right direction?
Thank you in advance.
If you are using a web server, like Apache, this is handled for you, in the exact manner you are describing.

Event logging system in PHP batching events for db insert

I'm currently working on an event-logging system that will form part of a real-time analytics system. Individual events are sent via RPC from the main application to another server, where a separate PHP script running under Apache handles the event data.
Currently the receiving server's PHP script hands off the event data to an AMQP exchange/queue, from where a Java application pops events off the queue, batches them up, and performs a batch DB insert.
This will provide great scalability, however I'm thinking the cost is complexity.
I'm now looking to simplify things a little so my questions are:
Would it be possible to remove the AMQP queue and perform the batching and inserting of events directly to the db from within the PHP script(s) on the receiving server?
And if so, would some kind of intermediary database be required to batch up the events or could the batching be done from within PHP ?
Thanks in advance
Edit:
Thanks for taking the time to respond. To be more specific: is it possible for a PHP script running under Apache to be configured to handle multiple HTTP requests?
So, as Apache spawns child processes, each of these processes would be configured to accept, say, 1000 HTTP requests, deal with them, and then shut down?
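For reference, this is roughly what Apache's prefork MPM already does: each child serves a configurable number of requests and is then recycled. An illustrative snippet with example values only:

    <IfModule mpm_prefork_module>
        StartServers             10
        MaxRequestWorkers        150
        MaxConnectionsPerChild   1000
    </IfModule>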
I see three potential answers to your question:
Yes
No
Probably
If you share metrics of alternative implementations (because everything you ask about is technically possible, so please build it first and then get hard results), we can give better suggestions. But as long as you don't provide some meat, put it on the grill, and show us the results, there is not much more to tell.

load balancing in php

I have a web service written in PHP/MySQL. The script involves fetching data from other websites like Wikipedia, Google, etc. The average execution time for a script is 5 seconds (currently running on 1 server). Now I have been asked to scale the system to handle 60 requests/second. Which of these approaches should I follow?
- Split functionality between servers (I create one server to fetch data from Wikipedia, another to fetch from Google, etc., plus a main server.)
- Split load between servers (I create one main server which round-robins requests entirely to its child servers, with each child processing one complete request. What about sharing the MySQL database between child servers here?)
I'm not sure what you would really gain by splitting the functionality between servers (option #1). You can use Apache's mod_proxy_balancer to accomplish your second option. It has a few different algorithms to determine which server would be most likely to be able to handle the request.
http://httpd.apache.org/docs/2.1/mod/mod_proxy_balancer.html
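A minimal, illustrative balancer configuration (the hostnames are placeholders, and the relevant proxy modules such as mod_proxy, mod_proxy_http, and mod_proxy_balancer need to be loaded):

    <Proxy balancer://appcluster>
        BalancerMember http://app1.internal:80
        BalancerMember http://app2.internal:80
        ProxySet lbmethod=byrequests
    </Proxy>
    ProxyPass        / balancer://appcluster/
    ProxyPassReverse / balancer://appcluster/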
Apache/PHP should be able to handle multiple requests concurrently by itself. You just need to make sure you have enough memory and configure Apache correctly.
Your script is not a server; it's acting as a client when it makes requests to other sites. The rest of the time it's merely a component of your server.
Yes, running multiple clients (instances of your script - you don't need more hardware) concurrently will be much faster than running them sequentially. However, if you need to fetch the data synchronously with the incoming request to your script, then coordinating the results of the separate instances will be difficult; instead you might take a look at the curl_multi_* functions, which allow you to batch up several requests and run them concurrently from a single PHP thread.
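A small sketch of the curl_multi approach; the URLs are placeholders for the Wikipedia/Google fetches mentioned above:

    <?php
    $urls = [
        'https://en.wikipedia.org/wiki/PHP',
        'https://www.google.com/search?q=php',
    ];

    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers concurrently until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);   // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $responses = [];
    foreach ($handles as $url => $ch) {
        $responses[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);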
Alternatively, if you know in advance what the incoming requests to your web service will be, you should think about scheduling and caching the fetches so the data is already available when a request arrives.
