I know PHP scripts can run concurrently, but I've noticed that when I run a script and then run another right after it, the second one waits for the first to finish before it does anything.
Is there an Apache config setting I have to change, or do I need to use a different browser, or something?!
Thanks all
EDIT
If I go to a different browser, I can access the page/PHP script I need! Is something limiting the number of requests per browser? Or is it Apache?
Two things come to mind:
HTTP 1.1 has a recommended maximum of 2 simultaneous connections to any given domain. This would exhibit the problem where you can only open one page on a site at a time unless you switch browsers.
Edit: This is usually enforced by the client, not the server
If your site uses sessions, you'll find that only one page will load at a time...
...unless you call session_write_close(), at which point the second page can open the now-unlocked session file (a sketch follows below).
If you run PHP under FastCGI you can probably avoid this restriction.
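As a minimal sketch of the session point, assuming a placeholder session key:
<?php
// Read what you need from the session, then release the lock
session_start();
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null; // 'user_id' is a placeholder
session_write_close(); // unlocks the session file for other requests from this browser

// Slow work no longer blocks the visitor's other pages
sleep(5);
echo "Done for user $userId";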
I haven't had this problem with Apache/PHP. The project I work on runs three iframes, each of which is a PHP script, plus the main page; Apache handles all four at the same time.
You may want to check your Apache configuration and see how the threading is set up.
This article may help:
http://www.devside.net/articles/apache-performance-tuning
Related
I've got a long-running PHP 7.2 script that produces a zip file. I want to use a looping Ajax call to check on the progress of building the zip file. The second script appears to be locked and doesn't start processing until the first script is completely done.
In fact, the second script doesn't even print the error_log() on line 1 of my index.php routing script until after the first script is completely finished.
The top of my index.php router script:
<?php
error_log('top of index '.$_SERVER['REQUEST_URI']);
This is true even if I'm just requesting static image resources. The error_log() on line 1 doesn't even print until after the long-running script has completely finished.
At first, I believed I was running into session locking as described here, but the solution offered there (calling session_write_close()) doesn't seem to work, and I'm wondering if something else is going on, because the second script locks up before line 1 rather than when I try to start the session. The second script doesn't seem to be starting at all. I thought maybe the server was automatically starting the session before line 1, but I checked, and my server has session.auto_start=0.
Is there some server configuration I need to set?
What am I missing? What am I doing wrong?
I'm running this on localhost (Mac) with the built-in PHP server.
php -c /usr/local/etc/php/7.2/php.ini -S 127.0.0.1:8080 index.php
This is because the PHP server only uses a single process by default, which prevents it from executing multiple requests simultaneously.
The built-in server supports multiple workers from PHP 7.4 onward (on platforms where fork() is available). See this commit: https://github.com/php/php-src/commit/82effb3fc7bcab0efcc343b3e03355f5f2f663c9
To use it, set the PHP_CLI_SERVER_WORKERS environment variable before starting the server. For example:
PHP_CLI_SERVER_WORKERS=10 php7.4 -S 127.0.0.1:7080 index.php
I recommend upgrading your PHP 7.2 installation if you want to utilize this feature.
The built-in web server is a single-threaded, single-process, synchronous server intended only for testing; it can handle just one request at a time.
I have AMPPS installed.
My Apache server cannot handle multiple PHP requests at once (for example, if I call localhost/script.php multiple times, they are processed consecutively). script.php consists only of <?php sleep(10); ?>.
I read that the MaxClients directive is responsible for configuring concurrent access, but it is missing from my httpd.conf entirely.
Disabling Xdebug and writing session_write_close(); to the beginning of the script didn't work.
When I added session_start(); to the beginning of the file, my code looked like:
<?php
session_start();
session_write_close();
sleep(10);
phpinfo();
echo "Done";
When making 5 requests to localhost/script.php, the last 4 waited for the first one to finish and then completed at the same time.
Please help me resolve this issue. If any information needed to diagnose the problem is missing, let me know and I will add it.
Apache can certainly handle multiple requests at the same time; something must be going wrong in your Apache configuration.
It depends on which version of Apache you are using and how it is configured, but a common default configuration uses multiple workers with multiple threads to handle simultaneous requests. See http://httpd.apache.org/docs/2.2/mod/worker.html for a rundown of how this works.
The reason you are seeing this is one of the following:
There is a lock somewhere, which can happen, for instance, if the two requests come from the same client and you are using file-based sessions in PHP: while a script is being executed, the session is "locked", which means the second request has to wait until the first one is finished (and the file unlocked) before it can open the session.
The requests come from the same client AND the same browser; most browsers will queue the requests in this case, even when there is nothing server-side producing this behaviour.
Probably because of session locking. When you don't need to edit session variables, close the session.
http://php.net/manual/en/function.session-write-close.php
Do your session writes at the start of script.php:
<?php
// Do the writes first, then unlock the session file
session_start();
$_SESSION['admin'] = 1;
$_SESSION['user'] = 'Username';
session_write_close(); // unlock the session file so another script can access it

// The rest of the script runs without holding the session lock
sleep(30);
echo $_SESSION['user']; // $_SESSION remains readable after closing

// Another script can now run without waiting for this one to finish
Have you tried making the simultaneous calls from different browser tabs/windows/instances?
Apache is multi-process and multi-threaded, so it can definitely handle your parallel requests. It seems you have some things to check:
Make requests with an appropriate client for testing (like Apache Benchmark). Take a look at https://httpd.apache.org/docs/2.4/programs/ab.html
Check your Apache setup. Some wrong settings can produce strange behavior, like handling a single request at a time. Take a look at the prefork and worker parameters in httpd.conf. Suggestion: use all default parameters for testing.
Apache provides a variety of multi-processing modules (Apache calls these MPMs) that dictate how client requests are handled. Basically, this allows administrators to swap out the connection-handling architecture easily. These are:
mpm_prefork: This module spawns processes with a single thread each to handle requests. Each child can handle a single connection at a time (an illustrative configuration block follows below).
mpm_worker: This module spawns processes that can each manage multiple threads. Each of these threads can handle a single connection. Since there are more threads than processes, new connections can immediately take a free thread instead of having to wait for a free process.
mpm_event: This module is similar to the worker module in most situations, but is optimized to handle keep-alive connections. When using the worker MPM, a connection holds a thread for as long as it is kept alive, regardless of whether a request is actively being made.
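For illustration, a typical prefork tuning block might look like this in httpd.conf; the values are placeholders, not recommendations:
# Illustrative prefork values; MaxRequestWorkers was called MaxClients before Apache 2.4
<IfModule mpm_prefork_module>
    StartServers            5
    MinSpareServers         5
    MaxSpareServers        10
    MaxRequestWorkers     150
    MaxConnectionsPerChild  0
</IfModule>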
Try including the sleep() and phpinfo() inside the session, before calling session_write_close().
It looks like all five requests are treated as the same session and only end when the first one terminates. Maybe verify whether the session IDs are the same.
With the session kept open, you can see whether the requests really are being serialized by the session.
I encountered a similar problem: multiple requests kept hanging randomly while connecting to the server.
I tried changing the MPM configuration, with no luck.
Finally, this seems to have solved the problem for me (from https://serverfault.com/a/680075):
AcceptFilter http none
EnableSendfile Off
EnableMMAP off
You can move session storage from files to a database; then you can request your files all at once without waiting. Or, if you don't need the session in your script, turn it off (don't call session_start();). A database-backed handler is sketched below.
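A minimal sketch of the database route using PHP's SessionHandlerInterface. The PDO credentials and the sessions table (id, data, updated_at) are assumptions, and note that this simple version also gives up the per-request locking the file handler provided:
<?php
// Sketch: store sessions in MySQL instead of files.
// Assumed table: sessions(id VARCHAR PRIMARY KEY, data TEXT, updated_at DATETIME)
class DbSessionHandler implements SessionHandlerInterface
{
    private $pdo;

    public function __construct(PDO $pdo) { $this->pdo = $pdo; }

    public function open($savePath, $sessionName) { return true; }
    public function close() { return true; }

    public function read($id)
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        return (string) $stmt->fetchColumn(); // '' when the session is new
    }

    public function write($id, $data)
    {
        $stmt = $this->pdo->prepare('REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, NOW())');
        return $stmt->execute(array($id, $data));
    }

    public function destroy($id)
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE id = ?')->execute(array($id));
    }

    public function gc($maxLifetime)
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE updated_at < NOW() - INTERVAL ? SECOND')->execute(array($maxLifetime));
    }
}

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
session_set_save_handler(new DbSessionHandler($pdo), true);
session_start();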
I have a webpage that when users go to it, multiple (10-20) Ajax requests are instantly made to a single PHP script, which depending on the parameters in the request, returns a different report with highly aggregated data.
The problem is that a lot of the reports require heavy SQL calls to get the necessary data, and in some cases, a report can take several seconds to load.
As a result, because one client is sending multiple requests to the same PHP script, you end up watching the reports load on the page one at a time. In other words, the reports are not generated in parallel, which causes the page to take a while to fully load.
Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?
Thank you.
As far as I know, it is possible to do multi-threading in PHP.
Have a look at the pthreads extension.
What you could do is run the report-generation part of the script in parallel, so that each report executes in a thread of its own and your results come back much sooner. Also, cap the number of concurrent threads at 10 or fewer so that it doesn't become a resource hog; a rough sketch follows below.
Here is a basic tutorial to get you started with pthreads.
And a few more examples which could be of help (Notably the SQLWorker example in your case)
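A very rough sketch of a pooled pthreads approach. Caveats: pthreads requires a thread-safe (ZTS) build of PHP and, in v3, runs only under the CLI SAPI; the ReportTask class, credentials, and query here are all hypothetical:
<?php
// Hypothetical task: each report runs its query in its own thread.
class ReportTask extends Threaded
{
    private $reportId;
    public $result;

    public function __construct($reportId) { $this->reportId = $reportId; }

    public function run()
    {
        // Each thread opens its own database connection; connections cannot be shared across threads
        $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT SUM(amount) FROM sales WHERE report_id = ?');
        $stmt->execute(array($this->reportId));
        $this->result = $stmt->fetchColumn();
    }
}

$pool = new Pool(10, Worker::class); // at most 10 threads at once
$tasks = array();
foreach (range(1, 20) as $id) {
    $tasks[$id] = new ReportTask($id);
    $pool->submit($tasks[$id]);
}
$pool->shutdown(); // wait for all submitted tasks to finish

foreach ($tasks as $id => $task) {
    echo "Report $id: {$task->result}\n";
}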
Server setup
This is more of a server-configuration issue and depends on how PHP is installed on your system: if you use php-fpm, you have to increase the pm.max_children option (illustrative values below). If you use PHP via (F)CGI, you have to configure the web server itself to use more children.
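For example, in a php-fpm pool configuration (the file path and values are illustrative only):
; e.g. /etc/php-fpm.d/www.conf
pm = dynamic
pm.max_children = 20
pm.start_servers = 5
pm.min_spare_servers = 3
pm.max_spare_servers = 8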
Database
You also have to make sure that your database server allows that many concurrent processes to run. It won’t do any good if you have enough PHP processes running but half of them have to wait for the database to notice them.
In MySQL, for example, the setting for that is max_connections.
Browser limitations
Another problem you're facing is that browsers won't make 10-20 parallel requests to the same host. It depends on the browser, but to my knowledge modern browsers will only open 2-6 connections to the same host (domain) simultaneously, so any further requests just get queued, regardless of server configuration.
Alternatives
If you use MySQL, you could try to merge all your calls into one request and use parallel SQL queries using mysqli::poll().
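A rough sketch of that mysqli approach (requires the mysqlnd driver; credentials and queries are placeholders):
<?php
// Fire each query asynchronously on its own connection
$queries = array(
    'SELECT COUNT(*) FROM orders',
    'SELECT SUM(total) FROM orders',
    'SELECT COUNT(DISTINCT customer_id) FROM orders',
);
$pending = array();
foreach ($queries as $sql) {
    $link = mysqli_connect('localhost', 'user', 'pass', 'shop');
    mysqli_query($link, $sql, MYSQLI_ASYNC); // returns immediately
    $pending[] = $link;
}

// Collect results as the server finishes them
while ($pending) {
    $read = $error = $reject = $pending;
    if (mysqli_poll($read, $error, $reject, 1) < 1) {
        continue; // nothing ready within 1 second; poll again
    }
    foreach ($read as $link) {
        if ($result = mysqli_reap_async_query($link)) {
            print_r(mysqli_fetch_row($result));
            mysqli_free_result($result);
        }
        // Drop the finished connection from the pending set
        $pending = array_values(array_filter($pending, function ($l) use ($link) {
            return $l !== $link;
        }));
    }
}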
If that’s not possible you could try calling child processes or forking within your PHP script.
Of course PHP can execute multiple requests in parallel if it uses a web server like Apache or Nginx. The PHP dev server is single-threaded, but it should only be used for development anyway. If you are using PHP's file-based sessions, however, access to the session is serialized; only one script can have the session file open at any time. Solution: fetch information from the session at script start, then close the session.
I am attempting to scrape a large amount of data from a website. (Probably about 50M records.) The website uses $_GET so it is simply a matter of generating a list of links, each one of which collects a bit of the data.
I have one script that generates a list of links on the screen. The links all call the same PHP script, passing a different search value. I then use the Chrome "LinkClump" extension to start all the links in separate tabs simultaneously (Right-click and drag across all links).
I start 26 tabs at once but the called PHP scripts do not all start. A write to log shows that only 6 ever run at once. The next one will not start until one of the others has finished. Is there any way to get more than 6 running at once?
Here is the relevant snippet of code in the 26 worker scripts that does the search. I simply pass a different $value to each one:
$html = file_get_html("http://website.com/cgi-bin/Search?search=$value");
foreach ($html->find('table[cellpadding="3"]') as $e) {
    foreach ($e->find('tr') as $f) {
        $colval = 0;
        foreach ($f->find('td[class="output"]') as $g) {
            // ... (cell processing omitted in the original snippet)
        }
    }
}
To check whether it was Apache or simple_html_dom that was throttling the connections, I wrote another tiny script that simply did a sleep(10) with a write to log before and after. Once again only 6 would execute at once, hence it must be Apache.
Is there some ini setting that I can change in my script to force more to run at once please?
I noticed this comment in another posting at Simultaneous Requests to PHP Script:
"If the requests come from the same client AND the same browser most browsers will queue the requests in this case, even when there is nothing server-side producing this behaviour."
I am running on Chrome.
Browsers typically limit the number of concurrent connections to a single domain. Each successive tab opened after this limit has been reached will have to wait until an earlier one has completed.
A common trick to bypass this behaviour is to spread the resources over several subdomains. So, currently you're sending all your requests to website.com. Change your code to send six requests each to, say, sub1.website.com, sub2.website.com, etc. You'll need to set these up on your DNS and web server, obviously. Provided your PHP script exists on each subdomain you should be able to run more connections concurrently.
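For illustration, the Apache side could be as simple as aliasing the subdomains onto the existing site (the names and document root are placeholders; the DNS records still need to exist):
# Illustrative: serve the same docroot under several subdomains
<VirtualHost *:80>
    ServerName website.com
    ServerAlias sub1.website.com sub2.website.com sub3.website.com
    DocumentRoot /var/www/website
</VirtualHost>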
I found the answer here: Max parallel http connections in a browser?
It is a browser issue. It indicates that Firefox allows the limit to be increased so I will try that.
For the benefit of others, here is what you have to do to allow Firefox to have more than 6 sessions with the one host. It is slightly different from the above post.
1. Enter about:config
2. Accept the warranty warning
3. Find network.http.max-persistent-connections-per-server and change it from 6 to whatever value you need.
You can now run more scripts on that host from separate tabs.
After a few hours, an Apache server slows to a crawl, and upon investigating, I can narrow it down to one website (each website has a separate user).
This website has multiple php5.cgi processes running, while a normal site has one or two, and it is killing the server.
My guess is that a script or plugin is causing these problems. Is it possible to determine which script spawned/created a process?
Thank you
If PHP is really running as a CGI, you can read the request variables for a php5.cgi process by looking at the process's environment variables in /proc/<pid>/environ.
There is no equivalent way to do this for FastCGI processes, though.
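A small sketch of reading it (the PID is a placeholder; you must run this as the same user as the process, or as root):
<?php
$pid = 12345; // PID of a php5.cgi process, e.g. from ps or top
$environ = file_get_contents("/proc/$pid/environ");
// Entries are NUL-separated; REQUEST_URI / SCRIPT_FILENAME reveal the script
foreach (explode("\0", $environ) as $entry) {
    if ($entry !== '') {
        echo $entry, PHP_EOL;
    }
}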