I'm a beginner in PHP, making a website that displays basic system information (CPU usage, memory usage, etc.) of a Linux system on a webpage. For the web server, I used the built-in web server:
php -S 192.168.1.36:8000
The frontend uses Bootstrap and JS. The PHP script uses Server-Sent Events (learnt about it from here) to send CPU usage, memory usage and disk usage (obtained via shell_exec()) to the frontend approximately once every 2 seconds.
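A simplified sketch of that kind of SSE script (the shell commands here are placeholders, not the exact ones used):

<?php
// Simplified SSE endpoint sketch; the shell commands are placeholders.
set_time_limit(0);                       // keep the stream alive past max_execution_time
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    // Stop when the browser closes or refreshes the page; otherwise the loop
    // keeps streaming to a connection nobody is listening to any more.
    if (connection_aborted()) {
        break;
    }

    $data = array(
        'cpu'  => trim(shell_exec("top -bn1 | grep 'Cpu(s)'")),
        'mem'  => trim(shell_exec('free -m | grep Mem')),
        'disk' => trim(shell_exec('df -h /')),
    );

    echo 'data: ' . json_encode($data) . "\n\n";
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();

    sleep(2);
}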
The problem is, the site is very slow to refresh, and occasionally, very slow to load the first time too.
When I looked at the JS console, I noticed that the server was still streaming data to the webpage even after I pressed the refresh button. Could it be because the connection hasn't been closed?
The built-in web server for PHP is meant for development use. It is single threaded, and initializing an incoming request takes many times longer than on a production server. On top of that, a page usually doesn't trigger just one request but also requests for JS, CSS and images, so it can take a few seconds to load a full page. It simply isn't built for performance.
For a simple test or a short development cycle it is okay to use it, but for intensive development I always prefer and recommend a real webserver.
Related
I have a webpage that, when users go to it, instantly fires multiple (10-20) Ajax requests at a single PHP script, which, depending on the parameters in the request, returns a different report with highly aggregated data.
The problem is that a lot of the reports require heavy SQL calls to get the necessary data, and in some cases, a report can take several seconds to load.
As a result, because one client is sending multiple requests to the same PHP script, you end up seeing the reports slowly load on the page one at a time. In other words, the generating of the reports is not done in parallel, and thus causes the page to take a while to fully load.
Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?
Thank you.
As far as I know, it is possible to do multi-threading in PHP.
Have a look at the pthreads extension.
What you could do is have the report-generation part of the script execute in parallel. That way each report runs in a thread of its own and your results come back much sooner. Also, cap the number of concurrent threads at 10 or fewer so that it doesn't become a resource hog (a rough sketch follows the links below).
Here is a basic tutorial to get you started with pthreads.
And a few more examples which could be of help (Notably the SQLWorker example in your case)
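For illustration, a minimal sketch of that idea, assuming a thread-safe (ZTS) PHP build with pthreads loaded; ReportTask and run_report_query() are made-up names:

<?php
// Hedged sketch: one thread per heavy report query.
class ReportTask extends Thread
{
    public $report;
    public $result;

    public function __construct($report)
    {
        $this->report = $report;
    }

    public function run()
    {
        // Each thread runs one heavy report query on its own;
        // returning a string (e.g. JSON) keeps the result easy to copy back.
        $this->result = run_report_query($this->report);
    }
}

$reports = array('sales', 'traffic', 'inventory');
$tasks   = array();

foreach ($reports as $report) {
    $task = new ReportTask($report);
    $task->start();          // kick off the thread
    $tasks[] = $task;
}

foreach ($tasks as $task) {
    $task->join();           // wait for the thread to finish
    echo $task->report . ': ' . $task->result . PHP_EOL;
}

To cap concurrency as suggested above, the tasks could instead be submitted to a fixed-size Pool, e.g. new Pool(10).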
Server setup
This is more of a server configuration issue and depends on how PHP is installed on your system: If you use php-fpm you have to increase the pm.max_children option. If you use PHP via (F)CGI you have to configure the webserver itself to use more children.
Database
You also have to make sure that your database server allows that many concurrent processes to run. It won’t do any good if you have enough PHP processes running but half of them have to wait for the database to notice them.
In MySQL, for example, the setting for that is max_connections.
Browser limitations
Another problem you're facing is that browsers won't make 10-20 parallel requests to the same host. It depends on the browser, but to my knowledge modern browsers will only open 2-6 connections to the same host (domain) simultaneously, so any further requests just get queued, regardless of server configuration.
Alternatives
If you use MySQL, you could try to merge all your calls into one request and run the SQL queries in parallel using mysqli::poll().
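A rough sketch of that approach (it requires the mysqlnd driver; the credentials and queries are placeholders):

<?php
// Hedged sketch of asynchronous mysqli queries.
$queries = array(
    'SELECT COUNT(*) FROM orders',
    'SELECT COUNT(*) FROM visits',
);

$links = array();
foreach ($queries as $sql) {
    $link = new mysqli('localhost', 'user', 'pass', 'db'); // one connection per query
    $link->query($sql, MYSQLI_ASYNC);                      // send without blocking
    $links[] = $link;
}

$pending = $links;
while ($pending) {
    $read = $error = $reject = $pending;
    if (!mysqli_poll($read, $error, $reject, 1)) {         // wait up to 1 second
        continue;
    }
    foreach ($read as $link) {
        if ($result = $link->reap_async_query()) {
            print_r($result->fetch_row());
            $result->free();
        }
        // drop the finished connection from the pending list
        $pending = array_filter($pending, function ($l) use ($link) {
            return $l !== $link;
        });
    }
}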
If that's not possible, you could try spawning child processes or forking within your PHP script.
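A very rough sketch of the forking route (pcntl is generally only available in CLI builds and is often disabled for web SAPIs, so treat this as illustrative; generate_report() is a hypothetical helper):

<?php
// Fork one child per report, then wait for all of them in the parent.
$reports  = array('sales', 'traffic', 'errors');
$children = array();

foreach ($reports as $report) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed for $report");
    } elseif ($pid === 0) {
        // Child process: generate one report, then exit.
        generate_report($report);
        exit(0);
    }
    $children[] = $pid;   // parent keeps track of its children
}

// Parent: wait for all children to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}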
Of course PHP can execute multiple requests in parallel, if it runs behind a web server like Apache or Nginx. The PHP dev server is single threaded, but it should only be used for development anyway. If you are using PHP's file-based sessions, however, access to the session is serialized: only one script can hold the session file open at any time. Solution: fetch what you need from the session at script start, then close the session.
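A minimal sketch of that pattern (the session key is just an example):

<?php
// Read what you need, then release the session lock so parallel
// requests from the same client are not serialized.
session_start();
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
session_write_close();   // releases the session file lock

// ...long-running report generation continues here without blocking other requests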
I am developing a web app that needs to call functions from a web service using SOAP.
I have coded a simple PHP backend to access the SOAP functions. The backend mainly consists of classes with static "helper" methods (I use __autoload() to load the classes only when needed).
At some point in the app, I need to call two SOAP functions at the same time. Since I want to speed up the process, I use two simultaneous ajax calls.
The thing is, those calls don't always run simultaneously. The requests are sent at the same time, but the second call lasts longer than the first one (they should have roughly the same run time). If the first call lasts 600 ms, the second one lasts 1400-1600 ms.
I have set up a simple test, where I call both functions first in parallel and then, when both requests are complete, in serial one after the other.
I have tried profiling the PHP side (both using microtime() and xdebug), but I've consistently found that the calls have the same execution time on PHP (about 500ms).
What is REALLY driving me insane, though, is the fact that the very same code seems to work and break in an apparently random fashion. Sometimes I reload the page and the calls are executed simultaneously. Then I reload again, and the second call takes longer, once more.
PHP Session lock should not be an issue, I open the session only when needed and close it with session_write_close() afterwards.
The SOAP server called by the PHP backend supports multiple connections, and besides, my microtime() profiling has shown that it takes about 400 ms to complete a SOAP request, so I don't think the web service is to blame.
On the client side, the requests are sent at the same time. I tried adding a slight delay to the second call, to no avail.
But then again, this "random" behaviour is baffling me.
I have tested the app both on local (WAMP) and on two different servers (shared hosting). In all of the environments, PHP 5.5 was used.
What could be causing this discrepancy, and how could one try to isolate the problem, if the behaviour is not 100% consistent?
I've solved the mystery.
In my local environment, once I disabled xdebug, the ajax calls ran flawlessly in parallel.
On the remote servers, I still had the same problem (and no xdebug there at all).
So first of all, I've tried using one single AJAX call to a PHP script that used curl_multi_exec() to run the SOAP requests in parallel, basically a similar setup but mainly server-side.
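Conceptually, that script did something along these lines (a reconstructed sketch; the URLs are placeholders):

<?php
// Rough sketch of the curl_multi_exec() approach.
$urls = array(
    'https://example.com/soap-call-1.php',
    'https://example.com/soap-call-2.php',
);

$mh      = curl_multi_init();
$handles = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run the handles until all transfers are complete.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), PHP_EOL;
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);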
I found that I had the same discrepancies in the timings using this technique as well, with a very similar pattern.
I've contacted the hosting provider explaining my problem, and they told me that it might be due to the shared environment having limited resources depending on the overall load.
So in the end while it's possible to achieve this behaviour, in my specific case I had to change my strategy and use two serial AJAX calls, updating the view after the first call to give the impression of a faster response while waiting for the second call to complete.
I have a Drupal 7 site with some non-Drupal web services that use some of the Drupal API.
I need to hold a global array with some values that I update each minute and that must be available to each call to the web service.
I'm new to Drupal and PHP, and I'm wondering if I should use pure PHP like:
while (true) {
    doSomething();
    sleep(60);
}
or Drupal cron or something else?
Yes, you should use Drupal's Cron. There's a link to a comprehensive setup video for Drupal's Cron in the link you provided. Using sleep() in an infinite loop is a bad idea because, if you are on a shared hosting server such as GoDaddy, the number of concurrent processes that can be running is limited. So if 20 users are sending requests to your server and 20 PHP processes are sleeping, it can cause your server to crash (i.e. HTTP 50x errors).
With Cron, you can just save the data you need in a file that is updated by Cron, and read that file (concurrently, from multiple PHP processes) within your PHP script.
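Roughly, such a setup could look like this (the module name, helper function and file path are made-up examples, and cron is assumed to be triggered externally every minute):

<?php
// Hedged Drupal 7 sketch.
// hook_cron() runs on each cron pass; "mymodule" and the helper are hypothetical.
function mymodule_cron() {
  $values = mymodule_compute_values();   // build the array you need
  file_put_contents('public://mymodule_values.json', json_encode($values));
}

// In the web service, read the latest snapshot on every call.
function mymodule_get_values() {
  $raw = @file_get_contents('public://mymodule_values.json');
  return $raw !== FALSE ? json_decode($raw, TRUE) : array();
}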
I've written some JS scripts on my school's VLE.
It uses the UWA Widget Format and to communicate with a locally-hosted PHP script, it uses a proxy and AJAX requests.
Recently we've moved the aforementioned locally-hosted server from a horrible XP-based WAMP server to a virtual Server 2008 distribution running IIS and FastCGI PHP.
Since then - or maybe it was before and I just didn't notice - my AJAX calls are starting to take in excess of 1 second to run.
I've run the associated PHP script's queries on PHPMyAdmin and, for example, the associated getCategories SQL takes 0.00023s to run so I don't think the problem lies there.
I've pinged the server and it consistently returns <1ms as it should for a local network server on a relatively small scale network. The VLE is on this same network.
My question is this: what steps can I take to determine where the "bottleneck" might be?
First of all, test how long your script is actually running:
Simplest way to profile a PHP script
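A quick first pass with microtime() is often enough. A minimal sketch, assuming a getCategories() entry point like the one you mention:

<?php
// Quick timing sketch with microtime(); getCategories() stands in for the real work.
$start = microtime(true);

$categories = getCategories();   // the call whose AJAX round trip is slow

$elapsed = microtime(true) - $start;
error_log(sprintf('getCategories took %.4f seconds', $elapsed));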
Secondly, you should check the disk activity on the server. If it is running too many FastCGI processes for the amount of available RAM, it will swap and it will be very slow. If the disk activity is very high, then you know you've found your culprit. Solve it by reducing the maximum number of fastcgi processes or by increasing the amount of server RAM.
I've got a small PHP web app I put together to automate some manual processes that were tedious and time consuming. The app is pretty much a GUI that ssh's out and "installs" software to target machines based on atomic change numbers from source control (Perforce, if it matters). The app currently kicks off each installation in a new popup window. So, say I'm installing software to 10 different machines, I get 10 different pop-ups. This is getting to be too much. What are my options for kicking these processes off and displaying the results back on one page?
I was thinking I could have one popup that dynamically creates divs for every installation I'm kicking off, then does an AJAX call for each one and displays the output for each install in the corresponding div. The only problem is, I don't know how I can kick these processes off in parallel. It'll take way too long if I have to wait for each one to go out, do its thing, and spit the results back. I'm using jQuery if it helps, but I'm looking mainly for high-level architecture ideas atm. Code examples are welcome, but pseudocode is just fine.
I don't know how advanced you are, or even if you have root access to your server (which would be required), but this is one possible way. It uses several different technologies and would probably be better suited to a large-scale application than a small one, but I'll outline it anyway.
The following technologies/stacks are used (in addition to PHP, as you mentioned):
WebSockets (on top of node.js)
JSON-RPC Server (within node.js)
Gearman
What you would do is: from your client (so via JavaScript), when the page loads, open a connection to node.js via WebSockets (you can use something like socket.io for this).
Then, when you decide that you want to do a task (which might take a long time...), you send a request to your server. This might be some JSON-encoded raw body, or it might just be a simple GET /do/something. What is important is what happens next.
On your server, when the job is received, you kick it off to Gearman by adding a task to the Gearman server. Gearman then processes your task as a non-blocking request, so you can respond immediately to the client who made the request, saying "hey, we are processing your job".
Then your server, with all of its Gearman workers, receives the job and starts processing it. This might take, say, 5 minutes for argument's sake. Once it has finished, the worker builds a JSON-encoded message which it sends to your node.js server, which receives it via JSON-RPC.
After it grabs the message, it can then emit the event to any connections which need to know about it via websockets.
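On the PHP side, the Gearman hand-off might look roughly like this (the task name, payload and ports are illustrative, and the notification to node.js is only hinted at in a comment):

<?php
// --- In the web request handler: queue the job without blocking the response ---
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('install_software', json_encode(array(
    'host'      => 'target-machine-01',
    'change_no' => 12345,
)));
echo json_encode(array('status' => 'queued'));   // respond immediately to the browser

// --- In a long-running CLI worker process ---
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('install_software', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    $output = shell_exec('ssh ' . escapeshellarg($params['host']) . ' ./install.sh');
    // Notify node.js (e.g. via an HTTP/JSON-RPC call) so it can push the result
    // to connected WebSocket clients -- omitted here for brevity.
    return $output;
});
while ($worker->work());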
I needed something like this for a project once and managed to learn the basics of node.js in a day (having already a strong JS background). The second day I had a complete push/pull messaging job-notification platform.