The server at www.localhost.com is taking too long to respond - php

A very strange thing is happening. I am running a script on a new server (it works on my current server and on my laptop).
The strange thing is that I can only get it to (sort of) work when I increase the memory limit to 1024M (!). It is extracting a large zip file and going through the files, so I thought that was normal. But instead of the script terminating or ending with errors, I get an error from my browser:
The server at www.localhost.com is
taking too long to respond.
Localhost.com? The web server is just localhost:9090, and I can see Apache is still running. Maybe Apache crashes momentarily and the browser can't find the server? But there is nothing about Apache crashing in the log files.
This isn't a server issue; it's more to do with my PHP script and memory usage, I think, so no need to move this to Server Fault.
What could be the problem? How can I narrow down the cause? I am at a loss here!
The server is a Windows server running Apache 2.2 with PHP version 5.3.2. My laptop and the other working server are running PHP versions 5.3.0 and 5.3.1.
Thanks all for any help.

Ensure that:
ini_set('display_errors', 'On');
ini_set('error_reporting', E_ALL);
ini_set('max_execution_time', 180);
ini_set('memory_limit', '1024M'); // note: 'M', not 'MB' - PHP's shorthand only accepts K, M and G
I'd pop this in the top of the script and see what comes out. It should show you errors and the like.
The other thing, have you checked fopen and the path of the file which it's loading?
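A quick guard at the top of the script will confirm that (a minimal sketch; the path is a placeholder, not one from the question):
<?php
$zipPath = 'C:/uploads/archive.zip'; // placeholder path

// Fail fast with a clear message instead of a silent hang.
if (!is_file($zipPath) || !is_readable($zipPath)) {
    die("Cannot read zip file at $zipPath - check the path and permissions.");
}
?>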
Abs said:
check that the files being zipped up can be zipped by PHP (permissions, especially on a Windows OS with multiple users)

I kept getting this problem too, and none of these sites really helped until I started looking at the same symptom for people using Internet Explorer. The way I fixed it was to open the system hosts file, located at C:\Windows\System32\drivers\etc\hosts, and uncomment the line that mentions ::1, which is needed for IPv6. After that it worked fine.
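For reference, after the fix the relevant entries in the hosts file should look something like this (comment lines omitted; exact contents vary between Windows versions):
127.0.0.1       localhost
::1             localhost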

Somehow your system's munged up and isn't treating localhost as the local 127.0.0.1 address. Is your hosts file properly configured? This is most likely why you're getting the "too long to respond" error:
marc@panic:~$ host www.localhost.com
www.localhost.com has address 64.99.64.32
marc@panic:~$ wget www.localhost.com
--2010-08-03 22:41:05-- http://www.localhost.com/
Resolving www.localhost.com... 64.99.64.32
Connecting to www.localhost.com|64.99.64.32|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
www.localhost.com is a fully valid hostname as far as the DNS system is concerned.

I am not a PHP guru by any means, but are you writing the extracted files to a temporary local storage location that is within the scope of the application? If not, I think what is happening is that the application is holding the zip file and the extracted files in memory and then attempting to read them. So if it is a large zip and/or the extracted files are large, that would introduce a huge amount of overhead on top of the overhead introduced by your read and processing actions.
So if you are not already, I would extract the files and write them to disk in their own folder, dispose of the zip file at that point, and then iterate over the files in the newly created directory and perform whatever actions you need on them, as sketched below.
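A minimal sketch of that approach using ZipArchive ($zipPath, $workDir and process_file() are placeholders, not names from the question):
<?php
$zipPath = 'C:/uploads/archive.zip';
$workDir = 'C:/uploads/extracted';

$zip = new ZipArchive();
if ($zip->open($zipPath) !== true) {
    die("Could not open $zipPath");
}
$zip->extractTo($workDir); // write everything to disk first
$zip->close();             // release the archive handle
unlink($zipPath);          // dispose of the zip at this point

// Iterate over the extracted files one at a time, so only the
// current file has to fit in memory.
foreach (new DirectoryIterator($workDir) as $file) {
    if ($file->isFile()) {
        process_file($file->getPathname()); // your per-file logic (hypothetical helper)
    }
}
?>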

Related

PHP script timing out after 60 seconds

I'm currently writing a PHP script which accesses a CSV file on a remote server, processes the data, then writes the data to the local MySQL database. Because there is so much data to process and insert into the database (50,000 lines), the script takes longer than 60 seconds to run. The problem I have is that the script times out after 60 seconds.
To make sure it's not a MySQL issue, I created another script that enters an infinite loop, and it too times out at 60 seconds.
I have tried increasing/changing the following settings on the Ubuntu server but it hasn't helped:
max_execution_time
max_input_time
mysql.connect_timeout
default_socket_timeout
the TimeOut value in the apache2.conf file.
Could it possibly be an issue because I'm accessing the PHP file from a web browser? Do web browsers have time-out limits?
Any help would be appreciated.
The simplest and least intrusive way to get over this limit is to add this line to your script:
ini_set('max_execution_time', -1);
Then you are only amending the execution time for this script, and not for all PHP scripts, which would be the case if you amended either of the two php.ini files.
When you were trying to amend the php.ini file, I would guess you were amending the wrong one; there are two, one used only by the PHP CLI and one used by PHP running with Apache.
For future reference: to find the actual file used by PHP under Apache, just run
<?php
phpinfo();
?>
and look for "Loaded Configuration File".
I finally worked out the reason the request times out. The problem lies with having virtual server hosting.
The request from the web browser is sent to the hosting server which then directs the request to the virtual server (acts like a separate server). Because the hosting server doesn't get a response back from the virtual server after 60 seconds, it times out and sends a response back to the web browser saying exactly this. Meanwhile, the virtual server is still processing the script.
When the virtual server finally finishes processing the script, it is too late as the hosting server has already returned a timeout error to the front-end user.
Because the hosting server is used to host many virtual servers (for multiple different users), it is generally not possible to change the timeout settings on this server.
So, final verdict: The timeout error cannot be avoided with virtual hosting. If this is a serious issue, you may need to look into getting dedicated server hosting.
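If you are stuck on such a host, one workaround (not from the answers above, just a common pattern) is to split the import into batches so that each request finishes well inside the 60-second window. A rough sketch; the file name, batch size, 'offset' parameter and insert_row() helper are all illustrative assumptions:
<?php
$batchSize = 5000;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$file = new SplFileObject('data.csv');
$file->seek($offset); // jump to the first unprocessed line

$done = 0;
while (!$file->eof() && $done < $batchSize) {
    $row = $file->fgetcsv();
    if (is_array($row) && $row[0] !== null) {
        insert_row($row); // hypothetical helper wrapping the MySQL insert
    }
    $done++;
}

// Report where to resume; re-request the script with this offset
// (from cron, JavaScript, etc.) until the whole file is imported.
echo $file->eof() ? 'done' : 'next offset: ' . ($offset + $done);
?>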
Michael,
Your problem most likely comes from the PHP file, not from the web browser accessing it.
Did you try putting the following lines at the beginning of your PHP file ?
set_time_limit(0);
ini_set ('max_execution_time', 0);
PHP has two configuration files, one for Apache and one for the CLI, which explains why you don't hit the timeout when running the script from the command line. The phpinfo you gave me shows max_execution_time at 6000.
See the set_time_limit() documentation.
For CentOS8, the below settings worked for me:
sed -i 's/default_socket_timeout = 60/default_socket_timeout = 6000/g' /etc/php.ini
sed -i 's/max_input_time = 60/max_input_time = 30000/g' /etc/php.ini
sed -i 's/max_execution_time = 30/max_execution_time = 60000/g' /etc/php.ini
echo "Timeout 6000" >> /etc/httpd/conf/httpd.conf
Restarting Apache the usual way isn't good enough anymore. You have to do this now:
systemctl restart httpd php-fpm
Synopsis:
If the script (PHP function) takes 61 seconds or more, you will get a gateway timeout error. The "gateway" here refers to the PHP worker, meaning the worker timed out because that's how it was configured. It has nothing to do with networking.
php-fpm is a new service in CentOS 8. From what I gathered from the internet (I have not verified this myself), it basically keeps executables (workers) running in the background, waiting for you to give it PHP scripts to execute. The time saving is that the executables are always running; because they are already running, you suffer no start-up penalty.
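One more knob worth knowing about: php-fpm has its own per-request limit, request_terminate_timeout, which can kill a worker regardless of max_execution_time. The pool file path below is the CentOS 8 default, and the value just mirrors the examples above; both are assumptions, not verified on the poster's system:
# In /etc/php-fpm.d/www.conf (CentOS 8 default pool file):
request_terminate_timeout = 6000
# Then reload the service:
systemctl restart php-fpm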

Minify 2.1.5 works on localhost but not on live server due to document root issue

The problem:
My /min/ files are in different places for my local and live server:
localhost/min/
/mnt/foo/bar/hello/example.com/web/content/min/ (from $_SERVER['DOCUMENT_ROOT'])
Min works fine on my localhost, but after uploading the same files via FTP to my live server I get a 400 Bad Request error. This is clearly due to the different document roots.
What I've tried:
Changing the $min_documentRoot = '' variable in the config.php file to
$min_documentRoot = '/mnt/foo/bar/hello/example.com/web/content/'
Still gives the same errors.
Additional info:
My requests look like this:
src="/min/b=js&f=jquery.min.js,bootstrap.min.js,site.js"
What am I doing wrong?
How are you minifying? Is it with zip?
Make sure that the live and local servers both have the same setup. In particular, make sure that they both have the required minification functions available to them; for example, the Zip PHP functions are not installed by default on Fedora installations.
Check the online server's httpd error log.
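A quick sanity check you can upload to the live server (plain PHP, nothing Minify-specific) covers both points:
<?php
// Compare this value with the $min_documentRoot you set in config.php.
echo $_SERVER['DOCUMENT_ROOT'], "\n";
// Confirm the zip extension is actually loaded on this server.
var_dump(extension_loaded('zip'));
?>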

Max execution time

I have a site which is hosted on a dev site for demonstration to the client, and everything works without problem. However, when I download the files and database to my local EasyPHP installation, I receive the following error:
Fatal error: Maximum execution time of 30 seconds exceeded in C:\Program Files (x86)\EasyPHP-5.3.4.0\www\PC Estimating\classes\database.class.php on line 23
The database details for the connection are correct, as the Database object is already used on part of the template before this error is shown.
My question is, why does the system work fine on a live server, but not on EasyPHP?
You should check the max_execution_time setting in the php.ini files on your server and on your local installation.
BTW, what is done on line 23?
Copied from my comment to make the solution easier to find:
Some things really do run slower on Windows. On Mac/Unix, PHP connects to MySQL through a file socket, while on Windows it should use TCP/IP. Try using "127.0.0.1" instead of "localhost" when connecting to the DB.
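For example, with mysqli only the host string changes (credentials and database name are placeholders):
<?php
// An explicit IP skips the socket/pipe lookup that 'localhost'
// triggers and forces a plain TCP/IP connection.
$db = new mysqli('127.0.0.1', 'user', 'password', 'test');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}
?>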
This issue has two possible solutions:
1) Increase the max_execution_time in your php.ini. First, locate this file and edit it. Find this line:
max_execution_time=30
and replace it with:
max_execution_time=120
And then restart your webserver.
This will increase the limit from 30 seconds to 120 seconds. You can increase it even more, according to your application's needs.
2) If this setting doesn't solve the problem, you may have to look into your PHP application, because there may be an infinite loop or something similar.
More details about this problem:
https://www.copahost.com/blog/increase-php-max-execution-time/
Because your PC is slow compared to the server, and/or your code is really badly optimised.

GWT with -noserver

I'm making a GWT project that uses PHP to connect to a DB2 database. When I compile the project and deploy it to the server (copy the contents of the war directory over), it works fine. In hosted mode, though, I obviously run into the SOP issue, since GWT is on port 8888 while the PHP script is running on port 80.
I'm trying to get the -noserver option to work, but I must be missing something. I went back and created the basic sample app from the command line (webApplicationCreator -out /home/mike/gwt/sample1).
I edited the build.xml to include the -noserver and -port 80 arguments for devmode. I want my app to be hosted at localhost/sample1, so I edited the -startupUrl to the whole URL I want to use: http://localhost/sample1/sample1.html
I compiled (ant), copied sample1.html and sample1.css from war to the web server's sample1 directory, and copied the (md5).gwt.rpc, clear.cache.gif, sample1.nocache.js and hosted.html files from war/sample1 to the sample1/sample1 directory, as described in the GWT documentation (no history.html file was created).
I then run ant devmode from the project directory (/home/mike/gwt/sample1)
I can get to the sample1.html page, but when I click the button to send the name to the server it returns with
Remote Procedure Call - Failure
Server replies:
An error occurred while attempting to contact the server. Please check your network connection and try again.
I turned on Firebug and it's returning a 404 for http://localhost/sample1/sample1/greet. This is where I'm stuck: this file obviously doesn't exist on my web server, but why? Isn't this something that is supposed to be generated by GWT?
Can anyone give me a hand? Thanks!
So, basically, you've copied over the client side of a client/server application. When your GWT client application attempts a Remote Procedure Call (RPC) to the greeting service that is part of the initial sample, it can't find that service.
If you wanted to copy that service over, you'd need to have a Java application server, copy over the GreetingService, the web.xml that references it, and possibly a few other things (I'd have to check in more detail). That doesn't sound like what you actually want, so either you'll want to build a GWT-RPC service in PHP that responds to that URL, or remove the RPC call to the greeting service from the GWT code.
With a PHP back-end, you're probably not going to use GWT-RPC, I'm guessing that you're more likely to use JSON or XML, and if that's the case, then I'd go with removing the RPC call altogether for now.
Does this all make sense? Feel free to ask for further clarification.
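If you do go the JSON route, the PHP side can be tiny. A sketch (the greet.php file name and the 'name' parameter are assumptions for illustration, not part of the GWT sample):
<?php
// greet.php - a plain JSON stand-in for the GWT-RPC greeting service.
header('Content-Type: application/json');
$name = isset($_GET['name']) ? $_GET['name'] : 'stranger';
echo json_encode(array('greeting' => 'Hello, ' . $name . '!'));
?>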
To solve the SOP issue, I used the HttpProxyServlet to proxy the HTTP requests to my webserver through the development server.
Download httpProxyPackage.jar, copy it into WEB-INF/lib/, and configure it like so in WEB-INF/web.xml (this is for the StockWatcher tutorial, assuming your web root is the folder that contains the StockWatcher directory):
<servlet>
  <servlet-name>jsonStockData</servlet-name>
  <servlet-class>com.jsos.httpproxy.HttpProxyServlet</servlet-class>
  <init-param>
    <param-name>host</param-name>
    <param-value>http://localhost/StockWatcher/war/stockPrices.php</param-value>
  </init-param>
</servlet>
<servlet-mapping>
  <servlet-name>jsonStockData</servlet-name>
  <!--
    http://127.0.0.1:8888/stockPrices.php in dev mode
    http://gwt/StockWatcher/war/stockPrices.php in prod mode
  -->
  <url-pattern>/stockPrices.php</url-pattern>
</servlet-mapping>
Then redefine your JSON URL as:
GWT.getHostPageBaseURL() + "stockPrices.php?q=";
instead of:
GWT.getModuleBaseURL() + "stockPrices.php?q=";
It’s maybe not the best way, but if it can get someone else started… There was another way using php-cgi, but I didn’t have it installed.

Apache: reverse proxy to process PHP from another server

I have the following setup:
Plain-Server: delivers PHP files as plain text
Proxy-Server: asks the Plain-Server for the PHP file and parses it
Now my question: how do I configure the Proxy-Server (a fully configurable Apache 2.2 with PHP 5.3) to interpret the plain PHP files from the Plain-Server?
Example: given a small PHP script "hello.php" on Plain-Server (accessible through http://plainserver/hello.php):
<?php
echo "Hello World";
?>
Plain-Server only outputs it as plain text; it does no parsing of the PHP code.
On the Proxy-Server the file "hello.php" does not exist. But when hello.php is requested from the Proxy-Server, it should fetch hello.php from Plain-Server with mod_proxy (reverse proxy). It should also parse and execute the PHP, outputting only "Hello World".
The reverse proxy is already running, but the execution of PHP code is not working. I tried mod_filter, but couldn't make it work. Any ideas how to do that?
You may consider instead sharing the PHP files from your source server with your target server via an NFS mount or something similar. Tricking the proxy server into doing what you ask seems like the long way around the barn.
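For instance, something along these lines on the Proxy-Server, assuming the Plain-Server exports its web root over NFS (both paths are illustrative, and the Plain-Server would need a matching /etc/exports entry):
mount -t nfs plainserver:/var/www/html /var/www/html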
I totally agree with jskaggz.
You could build some awful tricks: an app that fetches the remote page, downloads it into a local file, and then redirects the user to that page so it can be executed. But there are a million security issues and things that might go wrong with that.
Can't you just convert the 'plain server' to a PHP-executing server and do some traditional reverse proxying on your 'proxy server', maybe using mod_proxy?
http://www.apachetutor.org/admin/reverseproxies
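For the record, once the Plain-Server executes PHP itself, the traditional setup is only a couple of mod_proxy directives on the Proxy-Server (a sketch using the hostnames from the question's example):
# httpd.conf on Proxy-Server: forward requests to Plain-Server, which
# now runs the PHP itself, and rewrite redirect headers on the way back.
ProxyPass        / http://plainserver/
ProxyPassReverse / http://plainserver/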
Answered this on the ServerFault version of this thread: https://serverfault.com/a/399671/48061
