Intermittently failing to open stream (HTTP request) in PHP

I am running Windows Server and it hosts my PHP files.
I am using "file_get_contents()" to call another PHP script and return the results. (I have also tried cURL with the same result)
This works fine. However, if I execute my script and then re-execute it almost straight away, I get an error:
"Warning: file_get_contents(http://...x.php): failed to open stream: HTTP request failed!"
So it works fine if I leave a minute or two between calls to this PHP file from the browser, but if I retry too quickly after a successful attempt, it fails. I have even pointed the URL in the line "$html = file_get_contents($url, false, $context);" at an empty file that simply prints out a line, and the HTTP stream still doesn't open.
What could be preventing me from opening a new HTTP stream?
I suspect my server is blocking further outgoing streams, but I cannot find where this would be configured in IIS.
Any help on this problem would be much appreciated.
**EDIT:** In the script, I call a Java program that takes around 1.5 minutes to run, and it is after this that I call the PHP script that fails.
Also, when it fails, the page hangs for quite some time. During this time, if I open another connection to the initial PHP page, the previous (still hanging) page then completes. It seems like a connection timeout somewhere.
I have set the timeout appropriately in IIS Manager and in PHP.
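For reference, a minimal sketch of how an explicit timeout can be attached to the stream context passed to file_get_contents(); the 30-second value is illustrative, and the session note only matters if both scripts share a PHP session:
// Hedged sketch: the timeout value and error handling are illustrative, not from the original script.
// session_write_close();  // worth ruling out: a shared session lock makes a second request hang until the first finishes

$context = stream_context_create([
    'http' => [
        'method'        => 'GET',
        'timeout'       => 30,    // give up on the remote script after 30 seconds
        'ignore_errors' => true,  // return the body even on 4xx/5xx so the status line can be inspected
    ],
]);

$html = file_get_contents($url, false, $context);
if ($html === false) {
    // $http_response_header is populated by the HTTP wrapper whenever response headers were received
    var_dump($http_response_header ?? 'no response headers received');
}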

Related

Setting a timeout for SplFileObject reading from a remote AWS S3 file

I am reading a file line by line directly from an AWS S3 server using SplFileObject.
$url = "s3://{$bucketName}/{$fileName}";
$fileHandle = new \SplFileObject($url, 'r');
$fileHandle->setFlags(SplFileObject::READ_AHEAD);
while(!$fileHandle->eof()){
$line = $fileHandle->current();
// process the line
$fileHandle->next();
}
That works perfectly fine in 99% of cases, except when the loop is running and there is a temporary network glitch: unable to access the next line from S3 for x seconds, the script exits prematurely.
The problem is that you never know whether the script completed its job or exited due to a timeout.
My questions:
1. Is there a way to explicitly set a timeout on SplFileObject while accessing a remote file, so that when the loop exits I can tell whether it exited due to a timeout or because the file really reached EOF? I checked stream_set_blocking() and stream_set_timeout(), but neither seems to work with SplFileObject.
2. Which timeout setting is this script currently following? Is it socket_timeout, a stream timeout, a cURL timeout, or simply the PHP script timeout (which is highly unlikely, I guess, as the script runs on the command line)?
Any hints?
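One hedged way to at least detect the premature exit, assuming the AWS SDK for PHP's S3 client ($s3Client) has already registered the s3:// wrapper: fetch the object's ContentLength up front and compare it with the bytes actually read. The 10-second socket timeout is illustrative and may not be honoured by the wrapper's HTTP handler.
ini_set('default_socket_timeout', 10);   // illustrative; the s3:// wrapper's HTTP handler may ignore it

$head = $s3Client->headObject(['Bucket' => $bucketName, 'Key' => $fileName]);
$expectedBytes = (int) $head['ContentLength'];

$url = "s3://{$bucketName}/{$fileName}";
$fileHandle = new \SplFileObject($url, 'r');
$fileHandle->setFlags(SplFileObject::READ_AHEAD);

$bytesRead = 0;
while (!$fileHandle->eof()) {
    $line = $fileHandle->current();
    $bytesRead += strlen($line);
    // process the line
    $fileHandle->next();
}

if ($bytesRead < $expectedBytes) {
    // eof() fired early: the stream most likely timed out rather than reaching the real end of the file
    throw new \RuntimeException("Read only {$bytesRead} of {$expectedBytes} bytes from {$url}");
}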

PHP's file_get_contents provokes a crash

I am having some trouble with PHP's file_get_contents function.
My setup is basically as follows:
-> From a php page (let's call it a.php), a POST request is sent to another php page (b.php) through file_get_contents.
-> b.php does some stuff with the POST input and then sends another POST request through file_get_contents to itself (b.php)
-> This is repeated a couple of times (for example, 4 times), so basically it looks like this:
a.php -> POST request through file_get_contents -> b.php -> POST request through file_get_contents -> b.php -> POST request through file_get_contents -> b.php -> POST request through file_get_contents -> b.php
At the last POST request to b.php, the script echoes something back to its caller, the caller adds something to it, and so on all the way back to a.php.
For clarity's sake: in production all those php files will be on different servers, and each server has an added value in the process.
For testing however, all pages are on the same server (and I add "?server=x" to the URL so that the same file uses a different database at every "call").
This works like a charm :) ... unless more than 5 file_get_contents calls are "active" simultaneously ...
This works fine:
a.php->b.php->b.php->b.php->b.php
This doesn't:
a.php->b.php->b.php->b.php->b.php->b.php
As a matter of fact, it crashes my server (it no longer responds to ANY HTTP requests), and only restarting Apache unblocks it.
The same happens when I load the working "circuit" a.php->b.php->b.php->b.php->b.php several times from the browser.
Error in the Apache error log:
failed to open stream: HTTP request failed!
I thought it might be related to the POST size being too large, but sending a HUGE POST request through the a.php->b.php->b.php->b.php->b.php circuit works just fine ....
So it looks like somehow only 5 simultaneous file_get_contents calls are allowed ...
Anyone's got some ideas ?
EDIT: As mentioned below, it looks like the real problem is a deadlock, which will not happen in production since there will be no "loop" on the same server ... I solved this issue by using cURL with a timeout instead. When a deadlock would otherwise occur, the cURL requests simply time out without freezing the server.
However, I'm still interested in an answer to this question: how can I check/reconfigure the number of simultaneous requests in Apache2? It's not in the conf file, AFAIK.
Thanks !!!
I would suggest using cURL instead of file_get_contents in this case, although I've never had an issue with this except(!) when using sessions, since they lock the process up until the session closes.
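A minimal sketch of the "cURL with a timeout" workaround described in the edit above; the URL, field name and the 5/10-second limits are illustrative, and session_write_close() only matters if a.php and b.php share a session:
session_write_close();                      // release the session lock before calling the next hop

$ch = curl_init('http://localhost/b.php?server=2');   // illustrative URL
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(['payload' => $payload]),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 5,            // give up connecting after 5 seconds
    CURLOPT_TIMEOUT        => 10,           // abort the whole request after 10 seconds instead of deadlocking
]);
$response = curl_exec($ch);
if ($response === false) {
    error_log('b.php call failed: ' . curl_error($ch));
}
curl_close($ch);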

PHP script on Amazon EC2 gives error 324 in the browser

We have a script which downloads a CSV file. When we run this script from the command line on the EC2 instance, it runs fine: it downloads the file and sends a success message to the user.
But if we run it through a browser, we get:
error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
When we check the back end for the downloaded file, it is there, but the success message sent after the download never reaches the browser.
We are using cURL to download from a remote location with authentication. The group and ownership of the folder is "ec2-user", and the folder has full rights, i.e. 777.
To summarize: the file is downloaded, but at the browser end we never receive any data or the success message that we print.
P.S.: The problem occurs when the downloaded file is 8-9 MB; with a smaller file, say 1 MB, it works. So either the script execution time, the download file size, or some EC2 instance configuration is blocking the response to the browser. The same script works perfectly fine on our GoDaddy Linux VPS. We have already raised the max execution time for the script.
Sadly, this is a known problem without a good solution. There's a very long thread on the Amazon forums here: https://forums.aws.amazon.com/thread.jspa?threadID=33427. The solution offered there is to send keep-alive data so the connection doesn't die after 60 seconds. Not a great solution, but I don't think there's a better one unless Amazon fixes the problem, which doesn't seem likely given that the thread has been open for 3 years.
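A hedged sketch of that keep-alive workaround: while cURL streams the remote CSV to disk, push a byte to the browser every few seconds so the connection isn't closed as idle. The file names, credential variables and 5-second interval are all illustrative.
$local    = fopen('/tmp/report.csv', 'w');
$lastPing = time();

$ch = curl_init('https://remote.example.com/report.csv');   // illustrative remote URL
curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");            // the remote location requires authentication
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use ($local, &$lastPing) {
    fwrite($local, $data);                                   // write the chunk to disk
    if (time() - $lastPing >= 5) {                           // roughly every 5 seconds...
        echo ' ';                                            // ...send something to the client
        if (ob_get_level() > 0) { ob_flush(); }
        flush();
        $lastPing = time();
    }
    return strlen($data);                                    // cURL expects the number of bytes handled
});
curl_exec($ch);
curl_close($ch);
fclose($local);

echo 'Download complete';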

PHP asynchronous multiple file write and read

I am using a cURL-based PHP application to make requests to another web server that handles requests asynchronously. What I am doing is creating files named <databaseid>.req containing the info I will need on the return, which also serves as the identification in the request. The requests are done via HTTP XML POST. The file is written using:
file_put_contents("reqs/<databaseid>.req", $data, FILE_APPEND);
What happens is that while the requests are being generated in bulk (about 1,500 per second), the responses start coming back from the web server. Each response is caught by another script, which reads the <databaseid> from the response and opens the corresponding request file using:
$aResponse = file("reqs/<databaseid>.req");
Now what happens is that in about 15% of requests, the file() call fails and generates an entry in the Apache log like this:
file(reqs/<databaseid>.req): failed to open stream: No such file or directory in <scriptname> on line <xyz>
It has been verified using a cleaner script that runs later that the file did exist.
Any ideas?!!!
There are functions to handle simultaneous file access, such as flock(), but it's normally easier to simply use a database. Any decent DBMS has already worked this out for you.
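A minimal sketch of the flock() route, keeping the <databaseid> placeholder from the question; $requestData and the retry count are illustrative:
// Writer side: take an exclusive lock while appending the request info.
$fh = fopen("reqs/<databaseid>.req", 'c');       // create the file if missing, never truncate it
if ($fh !== false && flock($fh, LOCK_EX)) {
    fseek($fh, 0, SEEK_END);
    fwrite($fh, $requestData . PHP_EOL);
    fflush($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
}

// Reader side: retry briefly instead of failing the moment the file is not there yet.
$aResponse = false;
for ($i = 0; $i < 5 && $aResponse === false; $i++) {
    $aResponse = @file("reqs/<databaseid>.req");
    if ($aResponse === false) {
        usleep(200000);                          // wait 200 ms before the next attempt
    }
}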

PHP file_get_contents sometimes fails during a cron job

I am trying to run a PHP script via a cron job, and sometimes (about half the time) I get the following warning:
PHP Warning: file_get_contents(http://url.com): failed to open stream: HTTP request failed! in /path/myfile.php on line 285
The program continues to run after that, which makes me think it is not a timeout or memory issue (the timeout is set to 10 minutes and the memory limit to 128M), but the variable in which I store the result of that call ends up empty. The weird part is that I make several other calls to this same website with other URL parameters, and they never have a problem. The only difference with this call is that the file it downloads is about 70 MB, while the others are all around 300 KB.
Also, I never get this warning if I SSH into the web server and run the PHP script manually, only when it is run from cron.
I have also tried using cURL instead of file_get_contents but then I run out of memory.
Thanks, any help here would be appreciated.
Perhaps the remote server on URL.com is sometimes timing out or returning an error for that particular (large) request?
I don't think you should be trying to store 70 MB in a variable.
You can configure cURL to download directly to a file. Something like:
$file = fopen('my.file', 'w');              // local file that will receive the download
$c = curl_init('http://url.com/whatever');
curl_setopt($c, CURLOPT_FILE, $file);       // stream the response body straight to the file instead of memory
curl_exec($c);
curl_close($c);
fclose($file);
If nothing else, curl should provide you with much better errors about what's going wrong.
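If it does fail, a small (illustrative) tweak to the curl_exec($c) line above makes cURL report the cause in the cron output:
if (curl_exec($c) === false) {
    // curl_error()/curl_errno() describe exactly why the transfer failed
    error_log('Download failed: ' . curl_error($c) . ' (cURL errno ' . curl_errno($c) . ')');
}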
From another answer: double-check that this issue isn't sometimes caused by the URL parameters you're using:
Note: If you're opening a URI with special characters, such as spaces, you need to encode the URI with urlencode() (http://docs.php.net/file%5Fget%5Fcontents).
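A quick illustration of that note; the parameter value and query string are made up:
$param = urlencode('monthly report 2013');                          // spaces become '+', other special characters are percent-encoded
$data  = file_get_contents('http://url.com/export?file=' . $param);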
