When I request a "huge" page with a lot of data to load and then make a second request to a "normal" content page, the normal page is blocked until the "huge" one has finished loading.
I activated the profiler and noticed that the FirewallListener was the blocking element.
Profiler screenshots (loaded the "huge" page, switched tabs, loaded the "normal" page):
[screenshot: Huge]
[screenshot: Normal]
While the "huge" page was loaded, i did a mysql php request on cli with some time measurements:
Connection took 9.9890232086182 ms
Query took 3.3938884735107 ms
So the database is not what is blocking.
Any ideas on how to solve that?
Setup:
- php-fpm 7.2
- nginx
- Symfony 3.4
It is being blocked by the PHP session.
You can't serve two pages that require access to the same session ID at the same time.
However, once you close/save/release the session on the slow page, another page can be served for the same session. On the slow page, call Session::save() as early as possible in your controller; this releases the session lock. Take into consideration that anything you write to the session after saving it will not be stored.
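A minimal sketch of that pattern, assuming a Symfony 3.x controller; the controller name, session key, helper method, and template are all illustrative:

    use Symfony\Bundle\FrameworkBundle\Controller\Controller;
    use Symfony\Component\HttpFoundation\Request;

    class HugeController extends Controller
    {
        public function hugeAction(Request $request)
        {
            $session = $request->getSession();

            // Do all session reads/writes up front...
            $filters = $session->get('filters', []);

            // ...then release the session lock so parallel requests can proceed.
            $session->save();

            // The slow work runs after the session has been released. Anything
            // written to the session past this point will not be persisted.
            $data = $this->loadHugeDataset($filters); // hypothetical slow query

            return $this->render('huge.html.twig', ['data' => $data]);
        }
    }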
The reason the firewall takes so long is that debug is enabled.
In debug mode, the listeners are all wrapped in debugging listeners, and all the information in the firewall is being profiled and logged.
Try running the same request with Symfony debug disabled.
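In Symfony 3.4 the debug flag is the second argument to the kernel constructor, so an abridged front controller with debug off looks roughly like this:

    // web/app.php (abridged)
    use Symfony\Component\HttpFoundation\Request;

    require __DIR__.'/../vendor/autoload.php';

    // 'prod' environment, debug = false: listeners are no longer
    // wrapped in the profiling decorators seen in the timeline.
    $kernel = new AppKernel('prod', false);

    $request = Request::createFromGlobals();
    $response = $kernel->handle($request);
    $response->send();
    $kernel->terminate($request, $response);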
We had a similar problem: when sending a couple of consecutive requests to the server in a short period, the server became very slow. I enabled the profiler bar, and a lot of time was spent in the ContextListener.
The problem was that file-system access on our server is very slow, and session information was stored on the file system, which is Symfony's default.
I configured my app to use the PdoSessionHandler, and the problem was gone.
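For reference, a minimal sketch of wiring up Symfony's PdoSessionHandler at the component level; the DSN and credentials are placeholders, and in a full Symfony app you would register the handler as a service and point framework.session.handler_id at it instead:

    use Symfony\Component\HttpFoundation\Session\Storage\Handler\PdoSessionHandler;

    $handler = new PdoSessionHandler('mysql:host=127.0.0.1;dbname=app', [
        'db_username' => 'app_user', // placeholder credentials
        'db_password' => 'secret',
    ]);
    // $handler->createTable(); // one-time setup: creates the sessions table

    session_set_save_handler($handler, true);
    session_start();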
Related
Hoping someone has some ideas around what to do with this.
We have a PHP-based application hosted in IIS.
There are a number of functions that can run for 10+ minutes. The problem I have is that if I run one of these functions in my web browser and then open another tab and try to access the site, the second tab just sits loading until the long process finishes, and only then does the page load.
I guess this is more of a multi-session thing with my browser? Is there some easy option in IIS I can change that will let the other pages load as normal? Or is this a browser thing?
It seems that if I open an InPrivate window at the same time, that one loads normally.
The issue is related to the session. By default, PHP sessions use the file system, so a request has to wait for the session file to be released before it can open it. Therefore, subsequent requests for the same session wait on prior requests.
To resolve the issue, you could try the following:
- Close the session as soon as you are done with it by calling session_write_close() (see the sketch after this list).
- Implement a custom session handler that uses a database instead of the file system.
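A minimal sketch of the first option in plain PHP; the long-running function is hypothetical:

    session_start();

    // Read whatever the request needs from the session first.
    $userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

    // Release the session file lock; other requests for this
    // session can now proceed in parallel.
    session_write_close();

    // The long-running work no longer blocks the rest of the site.
    runLongTask($userId); // hypothetical 10-minute job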
Reference:
FastCGI on IIS7... multiple concurrent requests from same user session?
My problem is that my Symfony application, running on a remote machine with Apache 2.4.6 (CentOS), PHP 5.6.31, and MySQL 5.7.19, does not handle concurrent requests: when asking for two different pages at the same time, the first one has to finish before the second one can be rendered.
I have another site on the same server written in plain PHP that has no problem rendering as many pages as possible at the same time (it uses the deprecated mysql extension rather than PDO, which Doctrine uses).
That said, I have done the following test:
I inserted a sleep(3); in my DefaultController, requested that page, and simultaneously requested a different one. See the two profiler screenshots below:
[profiler screenshot: page with sleep, called 1st]
[profiler screenshot: page without sleep, called 2nd]
For reference, page 1's normal load time is 782 ms and page 2's is 108 ms.
As you can see, Symfony's HTTP firewall is taking nearly all of the second page's load time.
My guess (which might be stupid) is that the first action holds the database connection and only releases it for other requests once it has finished, and that this has something to do with Doctrine's use of the PDO connection.
By the way, I have already read help and articles like:
- What is the Symfony firewall doing that takes so long?
- Why is constructing PDO connection slow?
- https://www.drupal.org/node/1064342
P.S. I have tried using both app.php and app_dev.php in the Apache configs; nothing changed. I stuck with app_dev.php so I could keep the profiler. Local development using Symfony's built-in server gives the same result.
You cannot have two concurrent requests for the same open session in PHP. When you use a firewall, Symfony locks the user session and keeps it locked until you release it manually or the request has been served.
To release the session lock, use this:
$session->save();
Note that there are some drawbacks and implications: after you save the session, you will not be able to update it (change attributes) until the next request arrives.
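To illustrate that note, a minimal sketch (the attribute name and values are illustrative):

    $session->set('status', 'queued');
    $session->save(); // lock released, attributes flushed to storage

    // Per the note above, changes made after save() will not reach
    // the session store during this request.
    $session->set('status', 'finished'); // lost when the request ends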
Session management: https://symfony.com/doc/current/components/http_foundation/sessions.html
Session interface: http://api.symfony.com/4.0/Symfony/Component/HttpFoundation/Session/SessionInterface.html#method_save
Note 2: if you have multiple concurrent users with different sessions, PHP will serve those requests concurrently.
I have an application in which a module lists tour packages from various third-party vendors through SOAP calls. This takes about 90 seconds to load.
Once one of the packages is clicked, another web service is called to get the details of that package.
Now, when you click the browser's back button from there, it is supposed to show the list without calling the web services again (from cache). That happens on the dev machine, which is HTTP. But on the production server, the back button refreshes the list page and I have to wait about 90 seconds again, which is pretty painful.
Is it because of HTTPS? How do I force navigating back without refreshing the previous page?
The application is written in PHP. The production server is Red Hat while the dev machine is Windows (if it helps).
It has nothing to do with HTTPS. Caching is controlled by the Expires/Cache-Control HTTP headers, which work the same regardless of whether the connection is encrypted. Most likely your production server is sending different cache headers than your development box.
Having said that, you should also employ server-side caching for such an expensive operation. Perhaps have the data refreshed periodically by a cron job or similar and saved on the server for fast retrieval. Any page that requires 90 seconds of work to be displayed needs to be rethought.
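A minimal sketch of that idea using a file cache; the path, TTL, and fetchPackagesFromVendors() are all illustrative:

    function getPackageList()
    {
        $cacheFile = sys_get_temp_dir().'/packages.json'; // placeholder path
        $ttl = 900; // refresh every 15 minutes

        // Serve the cached copy while it is fresh.
        if (is_file($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
            return json_decode(file_get_contents($cacheFile), true);
        }

        // Otherwise pay the ~90-second cost once and cache the result.
        $packages = fetchPackagesFromVendors(); // hypothetical SOAP aggregation
        file_put_contents($cacheFile, json_encode($packages), LOCK_EX);

        return $packages;
    }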
It is a PHP document that has only two hard-coded variables. It is a simple splash website that lets customers read about a product and click through to the actual product page. When I send 2000-4000 clicks within a 4-5 hour period, the page doesn't load all the way because of the high load! I do not have SSH access, only FTP. Is there anything I can do here?
Yeah, if you are using sessions and the default timeout is left as is, they will fill up the memory on your server and slow it right down; set your session timeout to a low number.
Which can be done with the info from here:
How to change the session timeout in PHP?
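For example, a minimal sketch of lowering the timeout before the session starts (the 600-second value is illustrative):

    // Let the garbage collector reclaim idle sessions after 10 minutes.
    ini_set('session.gc_maxlifetime', 600);
    // Expire the session cookie on the same schedule.
    ini_set('session.cookie_lifetime', 600);

    session_start();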
There could be two possibilities:
a) Since you don't have SSH access, I'll assume you're on a shared server, and if that's the case then 2000+ page loads for your site may be too much. You'd have to upgrade your hosting plan.
b) Too many session files for the server to handle. The server may be creating many session files as time passes while it keeps receiving more and more requests. To delete the session files you'd need SSH access, or you'd have to ask the server admin to do it for you.
If your server can't deal with 500+ requests an hour, then you'd be better off just getting a better hosting plan.
Servers are not limitless, and normally even higher-grade private, self-managed machines tend to be somewhat slow.
I'm not quite sure how to phrase this question correctly, so I'll start with the scenario I encountered.
I have a bit of processing in my web app that takes longer than I'd like the user to wait to regain control of the page, so I decided to have it handled by an AJAX request.
The problem is that even though I offloaded this work into an AJAX request, Apache seems not to process any further requests until the original processor-heavy request is complete.
I originally wanted to know how I could get around this issue, but have since decided that it might be a bad idea in general.
However, I'm still curious why Apache behaves this way and what (if any) configuration directive controls it. My original thought was KeepAlive, but disabling that didn't change the behavior.
I am running PHP through mod_php, if that makes a difference.
I appreciate any help getting pointed in the right direction!
Are you using file-based sessions? PHP locks the session file for each request and holds that lock until you call session_write_close() or the script terminates/exits. The side effect is that all requests for the same session become serial, since they're all contending for the same single resource (the session file).
I am sure it's the session file. I have the same problem: I run a long request, such as a phpMyAdmin SQL insert that takes multiple minutes to process. While it is processing, I try to open a new tab in the same browser and go to any page on my website, and it will not load until the original phpMyAdmin request is done.
If I open an incognito window in Chrome (the same browser), it works fine. If I open the website in any other browser, it is also fine.
So it is probably the file-based session, which is the default for PHP.
Others have mentioned moving to memcached. You could also save sessions in the database.
Before resorting to memcached, you could do all the session-based work at the beginning: copy the session variables into temporary variables so you can close the session, then close it. If you need to set a session value later, reopen the session, make the change, and close it again quickly (see the sketch below).
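A minimal sketch of that open/close pattern; the session keys and values are illustrative:

    // Grab what you need, then release the lock immediately.
    session_start();
    $cart = isset($_SESSION['cart']) ? $_SESSION['cart'] : [];
    session_write_close();

    // ...expensive processing using the local $cart copy...

    // Reopen only long enough to write the result, then close again.
    session_start();
    $_SESSION['cart_total'] = 99.95; // illustrative value
    session_write_close();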
Can you point to evidence that it's Apache? Unless your Apache setup isn't optimal, your page wait is most likely caused by something else; maybe you have set your AJAX call to be non-async?