linux server file download connection limit - php

I'm running a dedicated server...
Centos 6
PHP 5.3.3
Nginx 1.0.15
Nginx uses fastcgi to run php.
The server communicates with another server using remote sql.
A file called download.php opens a MySQL connection, checks some details in the database and then begins streaming the file's bytes to the user with a Content-Disposition header.
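Roughly, download.php does something like this (a simplified sketch; the table, column and file names are illustrative):

<?php
// download.php (simplified): look up the file, close the DB connection,
// then stream the file to the client with a Content-Disposition header.
$mysqli = new mysqli('sql.example.com', 'db_user', 'db_pass', 'db_name');

$stmt = $mysqli->prepare('SELECT path FROM downloads WHERE id = ?');
$id = (int) $_GET['id'];
$stmt->bind_param('i', $id);
$stmt->execute();
$stmt->bind_result($path);
$stmt->fetch();
$stmt->close();
$mysqli->close();   // the MySQL connection is closed before streaming starts

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);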
No matter what I do, I cannot get more than 5 simultaneous connections downloading a file. For instance, if I download a file using a download manager, a maximum of 5 connections can be made; the rest time out.
I've set up Nginx to accept up to 32 connections, and the MySQL connection is closed before the file begins to stream, so there shouldn't be a connection limit issue there.
Does anybody have any idea how I can increase the number of connections?
Perhaps an idea of what else I can check?
Thanks.

Edit /etc/init.d/php_cgi and set server_childs=32.
Problem solved!
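For anyone hitting the same cap: each php-cgi FastCGI child serves one request at a time, so the child count is effectively the number of downloads that can stream simultaneously. The relevant line in /etc/init.d/php_cgi looks roughly like this (the exact variable name varies between versions of this init script):

# /etc/init.d/php_cgi -- number of php-cgi children spawned for Nginx's fastcgi backend
server_childs=32   # each child handles one request at a time

Restart the php_cgi service after changing it so the new child count takes effect.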

Most likely the server is set to restrict connections by IP address. See http://www.nakedmcse.com/Home/tabid/39/forumid/14/postid/61/scope/posts/Default.aspx for more info.

Related

err_connection_timed_out and PHP Warning: mysqli_connect(): (HY000/2002): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock'

For the past few weeks I have been getting err_connection_timed_out intermittently while working on my website,
and I am sure my users get this error too.
When I look at the error log, there are a few messages like the one below:
PHP Warning: mysqli_connect(): (HY000/2002): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock'
I changed 'localhost' to '127.0.0.1' to use a TCP connection instead of the Unix socket, but without success.
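i.e. something like this (credentials are placeholders):

<?php
// Connect over TCP (127.0.0.1:3306) instead of the Unix socket.
$link = mysqli_connect('127.0.0.1', 'db_user', 'db_pass', 'db_name', 3306);
if (!$link) {
    die('Connect error: ' . mysqli_connect_error());
}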
My host's admins won't acknowledge any problem on the server side (as always!).
I asked them to restart MySQL, but they refused because it is a shared server.
It is a DirectAdmin server, and they recommend switching to cPanel as a last resort.
Do you have any experience with, or a solution for, this problem?
EDIT: The main problem is the err_connection_timed_out I get in the browser, and I don't know whether it is related to the MySQL connection errors in the log file.
Switching might help, because a cPanel server is set up differently.
But if it only happens sometimes, the MySQL server is probably just hiccuping.
These days it is normal for shared services to have per-user limits, so try enabling caching (if that is possible) and see whether the error becomes less frequent (there might be some queries that run for a long time).
Do you have any cron jobs or other processes running around the times you receive those errors?
Normally this problem is caused by the MySQL server timing out database connections faster than the application's connection pool notices. This exhausts the pool, because it keeps its connections open while the MySQL side has already closed them.
Easiest solution:
You can fix it simply by changing the wait_timeout parameter in the MySQL config.
More difficult, but better:
You can also close all connections in your code as soon as you have the values you need (see the sketch below), and finally you can change the way you connect to your database so that it reuses an existing connection.
And last but not least:
Try a VPS; there are plenty of offers out there, some even free. You won't be able to change anything on a shared host, sorry. :(
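To illustrate the "close your connections" point, a minimal sketch (host and credentials are placeholders):

<?php
// Open the connection, fetch what the page needs, and close it right away
// so idle connections don't sit around until MySQL's wait_timeout kills them.
$mysqli = new mysqli('127.0.0.1', 'db_user', 'db_pass', 'db_name');
// Using 'p:127.0.0.1' as the host instead would reuse a persistent connection.
if ($mysqli->connect_errno) {
    die('Connect error: ' . $mysqli->connect_error);
}

$rows   = array();
$result = $mysqli->query('SELECT id, title FROM posts LIMIT 10');
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}
$result->free();
$mysqli->close();   // released before the slow parts of the page run

// ... render the page from $rows; no DB connection is held open ...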

Persistant TCP Connection To Speed Up Remote File Fetching

I have a PHP script that uses file_get_contents to fetch a file on a remote server on every page load. Is it possible to make a persistent connection between the two servers to speed up the time it takes to fetch this file?
Your PHP process most likely ends with each request, so you will have to handle this outside of the main PHP process.
I would recommend setting up Nginx as a proxy, and pointing your PHP script at Nginx. You can then configure Nginx to use HTTP/1.1 keep-alive, which will keep a persistent connection open if requests are coming through regularly.
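A rough sketch of that setup (hostnames and ports are placeholders; the upstream keepalive directive needs Nginx 1.1.4 or newer):

# Local Nginx acting as a keep-alive proxy in front of the remote server
upstream remote_files {
    server files.example.com:80;
    keepalive 8;                      # keep up to 8 idle upstream connections open
}

server {
    listen 127.0.0.1:8080;

    location / {
        proxy_pass http://remote_files;
        proxy_http_version 1.1;
        proxy_set_header Connection "";          # required for upstream keep-alive
        proxy_set_header Host files.example.com;
    }
}

Your PHP script then calls file_get_contents('http://127.0.0.1:8080/path/to/file') and benefits from whatever connection Nginx already has open to the remote host.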

Starting a remote MySQL server using php

I currently use a VPS hosting MySQL to help reduce the load of my main server. Occasionally the SQL server conks out due to the amount of traffic it receives.
I'd like a small PHP script that will start the remote server. I already have a way of detecting when the SQL server isn't available, I just need a way of executing /etc/init.d/mysql start.
Have a look at the rexec command
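A minimal sketch of the PHP side (the command line is a placeholder: rexec flags vary between systems, so this example uses ssh with a key instead of embedding a password; substitute your rexec invocation if you prefer):

<?php
// Run a remote-execution command to start MySQL on the SQL host.
$cmd = 'ssh -i /home/web/.ssh/mysql_start_key root@sql.example.com '
     . escapeshellarg('/etc/init.d/mysql start');

exec($cmd . ' 2>&1', $output, $exitCode);

if ($exitCode !== 0) {
    error_log("Could not start remote MySQL:\n" . implode("\n", $output));
}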

Advantage and PHP - Error 6303: Maximum Advantage Database Server connections exceeded

My company's PHP website is connected to an Advantage Database Server that stores all the necessary data, such as users, passwords and customer records.
Lately we have started to get an error when requesting web pages:
Warning: SQL error: [Extended Systems][Advantage SQL][ASA] Error 6303: Maximum Advantage Database Server connections exceeded. axServerConnect, SQL state HY000 in SQLDriverConnect in C:\...\www\... on line...
It is becoming more critical by the day, and it can happen once a week or twice a day with no apparent reason.
When the website crashes, the database service keeps working fine for the other applications connected to it, and the only way to restore the web service is to restart the Apache web server.
On the database server we have an ads.ini configuration file in the C:\Windows folder, where we raised the max connections setting to "MAX_CONNECTIONS=1000", which is far more than we need.
Would it also be useful to set "RETRY_ADS_CONNECTS=1"?
I found this post where R&D confirms a bug in May 2009:
Is this a bug with Advantage Database?
Has this been fixed? In which release?
Where can I see the real number of connections Apache has open on the database?
Each PHP page closes its ADS connection in the footer, so what can cause the connections to be exceeded?
Thanks in advance for help.
-
ENVIRONMENT INFO
Database:
Advantage Database Server 10.10.0.6 on Windows 2003 server
Web server:
Apache/2.0.59 (Win32) mod_ssl/2.0.59 OpenSSL/0.9.8d PHP/4.4.7 on Windows XP pro
On the phpinfo() page we get "Advantage Version" "8.00.0.0".
Why is this? Do we need to upgrade the php_advantage extension?
Lots of questions, but I will try and address each.
1) 6303 Error. Using MAX_CONNECTIONS is the correct way to resolve this.
Make sure MAX_CONNECTIONS is in the [SETTINGS] section.
Check if Apache / PHP / ADS driver is using the correct ads.ini file. You can use Process Monitor from Sysinternals to see what ads.ini file was opened successfully. If you upgrade your PHP driver you can set an environment variable adsini_path to point to the directory where the ads.ini lives.
2) Setting RETRY_ADS_CONNECTS=1 will be helpful. This also goes under the [SETTINGS] section of the ads.ini. When an ADS client receives a networking error (generally a 6000-class error), the error is cached by the client driver and subsequent attempts to connect use the cached error instead of retrying. Setting RETRY_ADS_CONNECTS tells the ADS client to ignore the cached error and retry the connection.
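Put together, the entries discussed above would sit in ads.ini like this (values taken from the question):

; C:\Windows\ads.ini (or the directory pointed to by ADSINI_PATH)
[SETTINGS]
MAX_CONNECTIONS=1000
RETRY_ADS_CONNECTS=1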
3) Bug: It looks like this was fixed in version 9.10.0.9 of the driver, based on the release notes on http://devzone.advantagedatabase.com: "Fixed an issue where the garbage collection reference count on a connection would be incorrect if multiple SQL statements were opened on it."
Since you are running a 10.1 server, you may look at updating to the 10.1 client, which also contains the fix (the 10.1 Advantage PHP Driver).
4) Seeing the real number of connections:
I would recommend using the stored procedure sp_mgGetConnectedUsers; you can use ARC (Advantage Data Architect), but there it can be difficult to group, order, etc.
Since you are using 10.1, you can include the results of the stored procedure in a query such as:
SELECT COUNT(*) FROM (EXECUTE PROCEDURE sp_mgGetConnectedUsers()) u WHERE ADDRESS='xxx.xxx.xxx.xxx'
You could also use other fields to identify the PHP application, such as UserName (the server name), DictionaryUser (assuming the PHP application uses a unique user) or ApplicationID.
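For example, a quick way to check the count from PHP through the Advantage ODBC driver (the DSN name, credentials and web server address below are placeholders):

<?php
// Count the connections coming from the web server's IP address.
$conn = odbc_connect('AdvantageDSN', 'adssys', 'password');

$sql = "SELECT COUNT(*) AS cnt FROM (EXECUTE PROCEDURE sp_mgGetConnectedUsers()) u "
     . "WHERE ADDRESS = '192.168.1.10'";

$result = odbc_exec($conn, $sql);
odbc_fetch_row($result);
echo 'Connections from the web server: ' . odbc_result($result, 'cnt');

odbc_close($conn);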
5) phpinfo() shows the Advantage Version of the Advantage PHP client driver. You may want to upgrade the client driver for the reasons mentioned above. It should be as simple as swapping the DLL files (ace32.dll, axces32.dll, adsodbc.dll and php_advantage.dll), but I would recommend testing first to make sure you get everything.

AJAX requests hang when served in quick succession

On my laptop I have an app that makes 7 AJAX GET requests to a single PHP script at about the same time (millisecond difference). They all return successfully with the result I want.
Then I moved this script to a server (Windows Server) running Apache and PHP, and now the process hangs when I make the same 7 AJAX requests. However, if I make each request individually, they all come back successfully! Something doesn't want me to do all 7.
Why is this happening? What configuration variables in php.ini and httpd.conf can I look at to determine what this is?
Thanks
I think the problem might be on the browser side.
Most browsers limit themselves to 2 concurrent connections to the same server.
When you moved your application to the server, the extra latency might have overlapped your AJAX requests, which on localhost were being served in quick succession.
You may want to check out these related articles:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
The server may have a throttler in place to keep excessive requests from coming in too quickly.
Maybe your Apache configuration, or even Windows itself, limits the number of concurrent connections from the same IP. What version of Windows is it? What kind of Apache installation is it, standalone or part of XAMPP?
