Symfony2 - slow Ajax response - php

Whenever I make a simple Ajax POST request in Symfony, there seems to be a rather long waiting time before the response, both in the dev and prod environments.
I enabled Xdebug to get better insight into what's going on on the server: it seems to be Symfony's ClassLoader causing that long cycle. Page loads are similarly slow, generally over half a second.
I am running the application on localhost, on a Windows machine.
Is there any way I could reduce this execution time?
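For what it's worth, the usual Symfony2 remedy when the ClassLoader dominates the profile is to cache class lookups in APC from the front controller. A minimal sketch based on the Standard Edition's web/app.php (it assumes the APC extension is installed; the cache prefix is arbitrary):

<?php
// web/app.php (sketch): wrap the bootstrap class loader in ApcClassLoader
// so resolved class paths are cached in APC instead of being searched on
// the filesystem on every request.
use Symfony\Component\ClassLoader\ApcClassLoader;

$loader = require_once __DIR__.'/../app/bootstrap.php.cache';
$loader = new ApcClassLoader('my_prefix', $loader);
$loader->register(true);

A cheaper alternative that also works well on Windows is running composer dump-autoload --optimize, which replaces the autoloader's filesystem searches with a class-map lookup.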

Why are my concurrent AJAX requests to PHP Scripts/WP REST API so slow?

After some investigation, I updated the title of the question. Please see my updates below.
Original question:
I'm building a website with WordPress and sometimes make use of async calls to WP REST API endpoints.
Calling these endpoints from my AJAX functions often leads to TTFB times of at least ~780 ms.
But if I open the URL/endpoint directly in the browser, I get TTFB times that are 4-5 times faster.
I wonder where the delays come from. I'm running this page on my local dev server with Apache 2.4, HTTP/2 and PHP 7 enabled.
What is the best way to monitor such performance "issues"?
Please note: I'm not using WordPress' built-in AJAX functionality. I'm just calling something like
axios.get(`${url}/wp-json/wp/v2/flightplan`)
inside a React component that I've mounted in my homepage-template.
Update
Damn interesting: clearing cookies reduces TTFB a lot.
Update 2
After removing both of the other AJAX calls, the flightplan request performs much faster. I think there are some issues with concurrent AJAX requests. I've read a bit about session locking, but since WordPress and all of the installed plugins don't make use of sessions, this can't be the reason.
Update 3
Definitely, it has something to do with my local server setup. I just deployed the site to a "real" web server, and the requests are fast there.
But it would still be interesting to know how to set up a server that can handle concurrent requests better.
Update 4
I made a little test: calling 4 dummy requests before calling the "real" ones. The dummy script only returns a "Foobar" string. Everything looks fine at this point.
But when I add sleep(3) to the dummy AJAX script, all the other requests take much longer too.
Why?
Because your Ajax call has to wait for all your WP plugins to load :)
So you need to test without plugins, then activate them one by one to see which one slows down your Ajax call.
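One quick way to test that without clicking through the plugins screen is a must-use plugin that blanks the active-plugins list for REST requests only. A rough sketch (the URI check is naive and meant for local debugging; if the flightplan endpoint itself is registered by a plugin, whitelist that plugin instead of returning an empty list):

<?php
// wp-content/mu-plugins/no-plugins-for-rest.php (sketch)
// mu-plugins load before regular plugins, so this filter runs early
// enough to stop WordPress from loading any of them for REST calls.
add_filter('option_active_plugins', function ($plugins) {
    if (strpos($_SERVER['REQUEST_URI'], '/wp-json/') !== false) {
        return array(); // skip all plugins for REST API requests
    }
    return $plugins;
});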

PHP built-in web server running very slow

I'm a beginner in PHP and am making a website that displays basic system information (CPU usage, memory usage, etc.) of a Linux system on a webpage. For the web server, I used the built-in web server:
php -S 192.168.1.36:8000
The frontend uses Bootstrap and JS. The PHP script I'm using uses Server-Sent Events (learnt about them from here) to send CPU usage, memory usage and disk usage (it gets those from the shell_exec() function) to the front-end approximately once every 2 seconds.
The problem is, the site is very slow to refresh, and occasionally, very slow to load the first time too.
When I looked at the JS console, I noticed that the server was streaming data to the webpage even after I pressed the refresh button. Could it be because the connection hasn't been closed?
The built-in web server for PHP is for development usage. It is single-threaded, so it handles only one request at a time, and it is slow to initialize each incoming request. And mostly you don't have just one request, but also requests for JS, CSS and images, so it can take a few seconds to load a full page. It is simply not implemented for performance.
For a simple test or a short development cycle it is okay to use, but for intensive development I always prefer and recommend a real web server.
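Worth adding for this particular case: a Server-Sent Events endpoint holds its connection open, and since the built-in server serves exactly one request at a time (before PHP 7.4's PHP_CLI_SERVER_WORKERS environment variable), the open stream blocks every other request, including a refresh. A sketch of an SSE loop that at least exits when the browser disconnects (the load-average command is a placeholder for however you gather your stats):

<?php
// sse.php (sketch): stream system stats every 2 seconds and stop as
// soon as the client goes away, instead of writing to a dead socket.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (!connection_aborted()) {
    $load = trim(shell_exec('cat /proc/loadavg')); // placeholder metric

    echo 'data: ' . json_encode(array('load' => $load)) . "\n\n";

    // Flush so the event reaches the client immediately; attempting
    // output is also what lets connection_aborted() detect a closed
    // connection.
    @ob_flush();
    flush();

    sleep(2);
}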

How to troubleshoot ajax calls not being executed simultaneously in PHP backend?

I am developing a web app that needs to call functions from a web service using SOAP.
I have coded a simple PHP backend to access the SOAP functions. The backend mainly consists of classes with static "helper" methods (I use __autoload() to load the classes only when needed).
At some point in the app, I need to call two SOAP functions at the same time. Since I want to speed up the process, I use two simultaneous ajax calls.
The thing is, those calls don't always run simultaneously. The requests are sent at the same time, but the second call lasts longer than the first one (they should have roughly the same run time). If the first call lasts 600 ms, the second one lasts 1400-1600 ms.
I have set up a simple test where I call both functions first in parallel and then, when both requests are complete, in serial, one after the other.
I have tried profiling the PHP side (both using microtime() and Xdebug), but I've consistently found that the calls have the same execution time in PHP (about 500 ms).
What is REALLY driving me insane, though, is the fact that the very same code seems to work and break in an apparently random fashion. Sometimes I reload the page and the calls are executed simultaneously. Then I reload again, and the second call takes longer, once more.
PHP session locking should not be an issue: I open the session only when needed and close it with session_write_close() afterwards.
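For reference, the open-then-release pattern looks roughly like this (the session key is a placeholder):

<?php
// Read what's needed from the session, then release the lock right
// away so a second concurrent request isn't serialized behind it.
session_start();                  // acquires the session file lock
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
session_write_close();            // releases the lock immediately

// ...the long-running SOAP call can now proceed without blocking
// other requests from the same session.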
The SOAP server called by the PHP backend supports multiple connections, and besides, my microtime() profiling has shown me that it takes about 400 ms to complete a SOAP request, so I don't think the web service is to blame.
On the client side, the requests are sent at the same time. I tried adding a slight delay to the second call, to no avail.
But then again, this "random" behaviour is baffling me.
I have tested the app both on local (WAMP) and on two different servers (shared hosting). In all of the environments, PHP 5.5 was used.
What could be causing this discrepancy, and how could one try to isolate the problem, if the behaviour is not 100% consistent?
I've solved the mystery.
In my local environment, once I disabled Xdebug, the ajax calls ran flawlessly in parallel.
On the remote servers, I still had the same problem (and no Xdebug at all).
So first of all, I tried using a single AJAX call to a PHP script that used curl_multi_exec() to run the SOAP requests in parallel: basically a similar setup, but mainly server-side.
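A sketch of that curl_multi setup (the URLs and SOAP envelopes are placeholders, not the real ones from my app):

<?php
// Fire both SOAP requests from one PHP script and let cURL run the
// transfers in parallel.
$requests = array(
    'https://example.com/soap/serviceA' => '<soap:Envelope>...</soap:Envelope>',
    'https://example.com/soap/serviceB' => '<soap:Envelope>...</soap:Envelope>',
);

$mh = curl_multi_init();
$handles = array();
foreach ($requests as $url => $envelope) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $envelope);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive both transfers until they complete.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

$responses = array();
foreach ($handles as $url => $ch) {
    $responses[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);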
I found that I had the same discrepancies in the timings using this technique as well, with a very similar pattern.
I've contacted the hosting provider explaining my problem, and they told me that it might be due to the shared environment having limited resources depending on the overall load.
So in the end, while it's possible to achieve this behaviour, in my specific case I had to change my strategy and use two serial AJAX calls, updating the view after the first call to give the impression of a faster response while waiting for the second call to complete.

nginx php-fpm cache same multiple requests per second

I'm having the following situation:
An auction website where all connected users make an AJAX request to the server every 2 seconds.
The data changes every 2 seconds, so it cannot be cached for a long duration, and I was wondering what would be the best way to accomplish this:
If I have 200 requests in the same second, serve them all the same response instead of running PHP again and connecting to MySQL to get the results.
I don't know if this could be done with such a short cache duration of 1 second, and I also don't know what would be better to use: something on the Nginx side, or something on the PHP side such as APC.
Any ideas? Does it make sense?
My problem is that I've tried to tweak Nginx and php-fpm, and right now the setup can handle 200 requests/s at a 2000 ms response time; at 500 requests/s it's about 5000 ms. So I'm looking for a way to speed things up and handle as many requests per second as possible.
Update:
The website is running on Symfony2 so any suggestions related to it are also welcome.
Update 2!!!
I have moved the part of the application that handles the AJAX request into a single PHP file, without using the Symfony2 framework. It does 3 SQL queries and returns a JSON response. Now it can handle 1000+ requests per second at a 150 ms response time; it's really incredible. I guess Symfony2 really needs tweaks to be able to do the same, and I guess the problem was not PHP itself but all the memory used by the framework.
Vanilla PHP is of course faster than any PHP framework, but maintaining dozens of such scripts is painful. You can stick with Symfony and use Varnish to handle the heavy load. The cache TTL can be as low as 1 second, and with Varnish you can handle thousands of requests per second.
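For that to work, the PHP side has to mark the response as cacheable by shared caches. A plain-PHP sketch of the single-file endpoint from Update 2 with such headers (the DSN, credentials and query are placeholders):

<?php
// auction-state.php (sketch): emit headers that let Varnish serve the
// same JSON to every client for 1 second, so 200 identical requests in
// that window hit PHP and MySQL only once.
header('Content-Type: application/json');
header('Cache-Control: public, s-maxage=1, max-age=0');

$pdo = new PDO('mysql:host=localhost;dbname=auction', 'user', 'secret');
$rows = $pdo->query('SELECT id, current_bid FROM auctions')
            ->fetchAll(PDO::FETCH_ASSOC);

echo json_encode($rows);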

Artisan development server quits after every request

I'm new to Laravel/Composer and MVC frameworks in general. I've been going through some very good tutorials, but in my development environment, each time I run
php artisan serve
in the terminal and then make an HTTP request, the development server process is killed. The log doesn't record any errors at the time the request is made or the process is killed. This issue seems similar and is the only other thing I could find, but for me it happens on a fresh install and I'm not invoking the Sentry call anywhere (it even occurs when I request a simple static HTML or image file). Is there something I need to do to keep the server running after a request is made?
Thanks so much!!