On my laptop I have an app that makes 7 AJAX GET requests to a single PHP script at about the same time (millisecond difference). They all return successfully with the result I want.
Then I moved this script to a server (Windows Server) running Apache and PHP. However, the script hangs when I make the same 7 AJAX requests. Yet if I make each request individually, they all come back successfully! Something doesn't want me to do all 7 at once.
Why is this happening? What configuration variables in php.ini and httpd.conf can I look at to determine what is causing this?
Thanks
I think the problem might be on the browser-side.
Most browsers have a limit of two concurrent connections to the same server.
When you moved your application to the server, the extra latency probably caused your AJAX requests to overlap, whereas on localhost they were being served in quick succession.
You may want to check out these related articles:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
The server may have a throttler in place to keep excessive requests from coming in too quickly.
Maybe your Apache configuration limits the number of concurrent connections from the same IP, or even Windows does. Which version of Windows is it? What kind of Apache installation is it: standalone, or part of XAMPP?
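If the throttling is on the server side, a few httpd.conf directives are worth checking; this is only a sketch of where such limits usually live (directive names are for Apache 2.4 with the Windows MPM, and the values are examples, not recommendations):

# mpm_winnt runs a single child process; ThreadsPerChild caps concurrent requests
<IfModule mpm_winnt_module>
    ThreadsPerChild 150
    MaxConnectionsPerChild 0
</IfModule>
# Keep-alive settings determine how long each browser connection stays occupied
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5

Note that stock Apache does not limit requests per IP; a per-IP limit would come from an extra module such as mod_evasive or mod_limitipconn, if one is installed.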
I have set up a local server on a regular desktop (not a server machine), and 3-4 client machines access the local web application I developed on that server via a WiFi router (the server is connected to the router via cable; all clients connect via WiFi).
When two of the clients are connected to the application all is well, but when a third (or more) machine joins in, there are periods where no machine gets any service from the server (the application's webpage keeps loading until I manually reset Apache on the server via Services). At times the server responds when one of the clients refreshes their page, but most of the time I have to restart the Apache service.
This occurs roughly once an hour on average (over 6 hours of continuous usage) while the clients are using the application.
Server is running Windows 7 and Apache v2.4 with PHP v5.4
Server and all client machines are running AVG Internet Security
Firewall is handled by AVG Internet Security
Is this issue due to the code in my application, the desktop not being able to handle requests like a server machine, the antivirus, or a mix of the three?
If so, how can I set up the server to reset automatically?
Thanks
UPDATE
It is an application where users write reports on files after reviewing information:
- Frequent SQL requests for file data
- No images
- Some pages contain multiple SQL queries that represent the page content
- Network has no internet connection
- Code does not make requests for external information from the internet
- All client machines run the application on the Google Chrome web browser
It rarely happens, but sometimes the number of connections is restricted by a third-party interface used by the application. It is hard to figure out the reason without more details, such as what content your app serves and what error codes Apache or HTTP are returning.
This kind of situation is difficult to track down, especially on Windows, where diagnostic tools aren't as readily available as on other platforms.
I suppose you can try to rule out the antivirus by either running the server and clients with no antivirus at all for a few hours, or by disabling and re-enabling the antivirus when the hangup occurs.
Apart from that, you would need to pinpoint where the error occurs:
in the connection stage (the Windows OS is the problem)
in the response stage (Apache is the problem - try fiddling with the child spawning parameters)
in the management stage (PHP is the problem - you can probably check this by switching the setup between PHP-as-a-module and PHP-as-CGI-application)
in the database stage (that is, the connection to the SQL server). You can check this by setting up some pages that use different combinations of session, database, and output buffering and seeing whether those pages remain reachable even when the application is hung up.
For an example of the last, if a page such as
<?php print date("H:i:s"); phpinfo(); ?>
remains reachable and correctly refreshes (that's why the date() call is there) even when the app does not respond, this demonstrates that Windows, Apache, and PHP are "innocent", and either the SQL server has issues or you are not interfacing with it correctly. It might, for example, be the case (though unlikely in this instance) that the resident PHP module is accumulating connections to the SQL server and not releasing them, so that after a while you need to stop Apache (thereby freeing the module) and restart it.
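In the same spirit, a second probe page that only touches the database can separate the SQL stage from the rest. This is just a sketch, assuming MySQL via mysqli, with placeholder credentials that you would replace with your own:

<?php
// db-probe.php - check whether the SQL server still answers while the app hangs
print date("H:i:s") . "<br>\n";
// Placeholder credentials - substitute your real host, user, password, database
$link = mysqli_connect("localhost", "probe_user", "probe_pass", "probe_db");
if (!$link) {
    die("Connection failed: " . mysqli_connect_error());
}
// A trivial query: if this returns, the SQL server itself is responsive
$result = mysqli_query($link, "SELECT 1");
if ($result) {
    print "SQL server reachable<br>\n";
} else {
    print "Query failed: " . mysqli_error($link);
}
mysqli_close($link);

If the phpinfo() page stays up but this one stops responding during a hangup, the suspect is the database layer, for instance the scenario of accumulated, unreleased connections described above.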
If this were the case, even if it's not a "real" fix, you can set up Apache so that all children die and are replaced after a small number of requests (the default was once 150, but when leaks all but disappeared, I believe the default became 0 -- Apache children no longer die. Check it out; I might well misremember).
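As a sketch of that workaround (the directive is called MaxConnectionsPerChild in Apache 2.4 and MaxRequestsPerChild in older versions; the value is only an example):

# Recycle each Apache child/worker after 500 connections, so any leaked
# resources (for example dangling SQL connections) are released periodically
MaxConnectionsPerChild 500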
I have a working PHP website, running on IIS, at a client where I work. As we are switching to MS SQL, I need to enable php_pdo_sqlsrv_53_nts.dll. However, once I enable the extension, I start to receive a 500 error. My guess is that I need to restart the web server, but for certain reasons we would like to avoid that at this time.
Can you please tell me whether a restart of the web server is necessary on IIS to correctly enable a PHP DLL?
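For context, enabling the driver in php.ini typically amounts to lines like the following (the extension_dir path is only an example and must point at the folder that actually contains the DLL):

; php.ini - load the SQL Server PDO driver (non-thread-safe build for IIS/FastCGI)
extension_dir = "C:\php\ext"
extension = php_pdo_sqlsrv_53_nts.dll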
A restart is required even if you work on your localhost!
yes - see Microsoft.com
Mind you, restarting any of my web servers takes only a few seconds, so I'm not sure that's a big issue for your client. Do they have more than one server behind a load balancer or something? In that case you could do them one by one. Or maybe there's another smart way of temporarily rerouting traffic elsewhere by changing the DNS?
Contrary to popular opinion, I'm going to say No, and here's why:
Since you are using IIS, you could try recycling the App Pool instead, as long as getting the extension enabled is not urgent.
It might take a little while to cycle, but "recycle" uses an overlapping method: the old worker process stays up until its active requests are finished, while a new process handles any newly arriving requests. Once the old process has finished its existing requests, it gracefully exits. This ensures that service is not disrupted for end users. On the downside, if you have users that sit on the site for long periods of time, it may take a while before your PHP extension becomes available.
I've had success with this method in the past and was able to install PHP extensions without restarting IIS outright.
To Recycle in IIS 7:
Open Internet Information Services (IIS) Manager
Navigate to SERVERNAME > Application Pools
Select the pool you wish to recycle (the one attached to the site where you need the extension)
In the Actions pane, click "Recycle..."
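If you prefer the command line, the same recycle can be triggered with appcmd (the pool name below is a placeholder for whatever your site actually uses):

%windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"YourAppPool"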
I'm currently developing a PHP application that is going to use websockets for client-server communication. I've heard numerous times that PHP shouldn't be used for server applications because of the lack of threading mechanisms, its memory management (cyclic references), or its unhandy socket library.
So far, everything is working quite well. I'm using phpws as the websocket library and the Doctrine DBAL to access different database systems; PHP is version 5.3.8. The server should serve a maximum of 30 clients. Yet especially in the last few days I've read several articles stating how ineffective PHP is for long-running applications.
Now I'm not sure whether I should continue using websockets with PHP or rebuild the entire server-side application. I've tried Python with Socket.IO, though I did not get the results I expected.
I guess I have the following options:
Keep everything as it is.
Make the application use Ajax in combination with Socket.IO - e.g. run a server-side script that triggers the clients' Ajax calls when data is submitted to the server.
The last point sounds quite interesting, though it would require some work. Would it be a problem for servers to execute all the clients' requests at one time?
What would you recommend? Is the problem with PHP's memory management (I'm calling gc_collect_cycles() each time a client sends data to the server) still valid? Are there other reasons beside the obvious reasons (no threading, ...) for not using PHP as a server?
You can try running socket.io on a Node server on another port on your server (that is, if you are not using a hosting plan like GoDaddy).
I am using it and the performance is really satisfying.
I have an Apache server on port 80 serving my PHP files, and my server-client communication is done through a Node.js server running socket.io on port 8080 (dev) or 843 (prod).
Node.js is really light and has great performance, but you need to run it as a server. Nodejitsu.com is a hosting solution that supports the websocket protocol and is in beta, so it is still free for now. Just note that there you need to listen on port 80 with socket.io; this is a limitation of their network.
If you want all your pages to be accessed on port 80, then you will need a reverse proxy such as Varnish.
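As an alternative to Varnish, since Apache is already on port 80 you could let it forward the Socket.IO path to the Node process with mod_proxy. This is only a sketch; it assumes mod_proxy and mod_proxy_http are enabled and that Socket.IO is mounted at /socket.io/:

# In httpd.conf or the relevant vhost: forward Socket.IO traffic to Node on 8080
ProxyPass /socket.io/ http://localhost:8080/socket.io/
ProxyPassReverse /socket.io/ http://localhost:8080/socket.io/

Note that proxying the actual WebSocket upgrade additionally needs mod_proxy_wstunnel (Apache 2.4.5 or later); without it, Socket.IO will fall back to its polling transports.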
I hope that helps! Have a nice day.
Are there other reasons beside the obvious reasons (no threading, ...)
for not using PHP as a server?
Yep, lots of socket functions are incompatible with each other and it's hell to debug.
I tried something similar myself and quit, frustrated, since every function I thought would make sense didn't do what I expected.
I used to have a small chat app (which was almost working) that uses PHP, jQuery and MySQL. The volume of users is very small (only my friends use it). I used the long polling method for this.
And now I am thinking about using HTML5 WebSockets for this, because it is a lot more efficient. Also, most of my friends use Google Chrome (which already supports HTML5). I have gone through some tutorials that talk about HTML5 websockets, and I have downloaded phpWebSocket from GitHub and gone through the code. But the readme file says that the PHP page that listens for incoming connections should be run using "php -q" from the command line. So I searched for what this "-q" flag does, and I found that it runs the page in quiet mode. So what happens when I run this in quiet mode? Will it run endlessly? Will this running process affect system resources?
This PHP page should run the entire time; only then can connections be accepted. Isn't that right?
I have a shared hosting package with HostGator, and they allow cron jobs too. My present chat app (which uses the long polling method) inserts all messages into the database. When a user polls, it searches for any new messages in the database and then outputs them (if any).
So, I am a bit stuck here. :(
It should be run from the command line because, as you suspected, it is intended to run endlessly. It binds to a socket on the server and listens for incoming connections. It can't reliably be run from the browser.
The "-q" option tells it not to output any browser headers such as X-Powered-By: PHP or Content-Type: text/html
It will consume as much memory as PHP requires for as long as it's running. The memory footprint on startup with no clients will vary between configurations. The more connected clients, the more CPU, memory, and socket descriptors you will use. It uses select(), so its socket handling is efficient.
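To illustrate what such a long-running script looks like (this is not phpWebSocket's own code, just a minimal sketch of a select()-based loop; port 12345 and the echo behaviour are placeholders):

<?php
// server.php - run endlessly from the CLI, e.g.:  php -q server.php
// One process accepts many clients and multiplexes them with stream_select().
$server = stream_socket_server("tcp://0.0.0.0:12345", $errno, $errstr);
if (!$server) {
    die("Could not bind: $errstr ($errno)\n");
}
$clients = array();
while (true) {
    // Wait until the listening socket or any client socket becomes readable
    $read = $clients;
    $read[] = $server;
    $write = null;
    $except = null;
    if (stream_select($read, $write, $except, null) < 1) {
        continue;
    }
    foreach ($read as $sock) {
        if ($sock === $server) {
            // New incoming connection
            $clients[] = stream_socket_accept($server);
            continue;
        }
        $data = fread($sock, 4096);
        if ($data === false || $data === '') {
            // Client went away - drop it
            unset($clients[array_search($sock, $clients, true)]);
            fclose($sock);
        } else {
            fwrite($sock, $data); // placeholder: echo the data back
        }
    }
}

The important point is that this is a single process that never exits, which is exactly why shared hosts tend to forbid it.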
Also, since you're on shared hosting, you probably won't be able to use it because your user will most likely not have the ability to bind to a port and listen for connections.
As you can see in the demo, the URL to connect the WebSocket to is ws://localhost:12345/websocket/server.php. Unless you have a webserver capable of using WebSockets, you will have to run something like phpWebSocket that acts as a server and listens on a port other than 80.
Hope that helps.
The shared hosting package from HostGator does not allow clients to bind to local ports for incoming connections. This might be part of the problem.
http://support.hostgator.com/articles/pre-sales-policies/socket-connections
I am getting 403 errors from Apache when I send too many (12) synchronous HTTP POSTs from a desktop app I am building in Xcode / Objective-C. The 12 POST requests are just a few kB each and go out instantly, one after the other, and the Apache error log shows...
client denied by server configuration: /the-path/the-file.php
Apache 2.0, PHP 5, and I have this same setup working fine on my local machine. The error is coming from a VPS with my host, which runs very fast and smoothly and has plenty of resources. To debug, I threw a sleep(1) call (which stalls script execution for 1 second) into the PHP file, and that fixed it. This makes me think that I am breaking some limit on requests from a single IP within a certain amount of time. I have googled and combed through the PHP ini and Apache configs, but I cannot find what that directive/setting might be.
I should mention that, although it varies, the first 4 or 5 POSTs usually work, and then it starts returning the 403 error intermittently. It really acts like it's bogging down.
Any ideas?
The error tells you everything: most likely your VPS has flood control on its web server, which kicks in after 4 or 5 quickly-sequential hits. This has nothing to do with PHP itself, but rather entirely with Apache. In other words, your home setup is not the same as the VPS's setup.
Try disabling or configuring mod_evasive. It is an Apache module that provides evasive action in the event of an HTTP DoS or DDoS attack or a brute force attack. (Here you can read more about it.) Use these commands to disable mod_evasive:
a2dismod evasive
service apache2 restart
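Alternatively, instead of disabling it outright, you can raise its thresholds. Here is a sketch of the relevant directives (the values are only examples, and the IfModule name varies between mod_evasive20.c and mod_evasive24.c depending on your Apache version):

<IfModule mod_evasive20.c>
    # Allow up to 20 requests for the same URI per 1-second interval
    DOSPageCount 20
    DOSPageInterval 1
    # Allow up to 100 requests to the whole site per 1-second interval
    DOSSiteCount 100
    DOSSiteInterval 1
    # How long (in seconds) an offending client keeps receiving 403s
    DOSBlockingPeriod 10
</IfModule>

Either way, restart Apache afterwards for the change to take effect.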