I'm having some problems with my PHP server. Most of the functions, when run, give the same error:
Warning: fopen(http://www.ietf.org/rfc/rfc2475.txt) [function.fopen]: failed to open stream:
A connection attempt failed because the connected party did not properly respond after a
period of time, or established connection failed because connected host has failed to respond. in D:\inetpub\vhosts\coolfbapps.in\httpdocs\test\merger2.php on line 3
Fatal error: Maximum execution time of 30 seconds exceeded in D:\inetpub\vhosts\coolfbapps.in\httpdocs\test\merger2.php on line 3
The same error occurs when I use the imagecreate and get_image functions.
I talked to my service provider, but they said I should tell them the cause of the error so that they can rectify it. Please see if anyone can work out what changes should be made to the server to remove these errors.
CODE USED
$ch = curl_init("http://www.gravatar.com/avatar/95111e2f99bb4b277764c76ad9ad3569?s=32&d=identicon&r=PG");
$fp = fopen("http://www.ietf.org/rfc/rfc2475.txt", "r");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Sorry for this, but I'm unable to comment yet. You need to post the code that is causing these errors. Are you explicitly requesting that URL?
To debug this, you should log on to the server and attempt to request the file, to see if you're actually able to make outbound connections on port 80. To do this on a Linux server, just run
wget http://www.ietf.org/rfc/rfc2475.txt
... and see if it fails or not. If it does fail, you need to talk to your hosting provider / ISP.
If you don't have access to the server you could simply try (in PHP):
<?php
file_get_contents('http://www.google.com/'); // Google so that it's not the same URL
?>
If there's an error, same as above.
The first error (the fopen() one) is caused by a timeout in the response of the server you're trying to load data from, i.e. the server is taking too long to respond, so the connection times out.
The second error means the script is running for too long. PHP has a setting called max_execution_time so that a script can't eat up all the resources on a server. Your server is configured to allow 30 seconds of execution before the script is terminated with a fatal error.
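If you just want the fatal error out of the way while you track down the real cause, you can raise those limits temporarily. Here is a minimal sketch (not the original poster's code), assuming you only need to confirm whether the remote fopen() ever succeeds:
<?php
// Sketch: raise the limits while debugging so the fatal error does not
// mask the underlying connection timeout.
set_time_limit(120);                     // allow up to 120 seconds for this script
ini_set('default_socket_timeout', 10);   // give up on the remote fopen() after 10 seconds

$fp = @fopen('http://www.ietf.org/rfc/rfc2475.txt', 'r');
if ($fp === false) {
    echo "Could not open the remote file (connection blocked or timed out).\n";
} else {
    echo "Remote connection works.\n";
    fclose($fp);
}
?>
If this still fails even with a short socket timeout, the problem is almost certainly the server's outbound connectivity, not your script.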
Seems your server can't connect to these sites. Perhaps your server is not allowed to start outgoing connections or is blocking that outgoing HTTP traffic somehow?
I bet this isn't a PHP problem but instead has to do with your server connection. If you have SSH access, try to open these URLs from the command line without PHP.
After reading the following links:
get url content PHP
file_get_contents failed to open stream: Connection refused godaddy server with remote connection server
PHP file_get_contents() returns "failed to open stream: HTTP request failed!"
I'm having a problem with file_get_contents (and even cURL) on one of my servers at HostGator: they are not working, returning the PHP error failed to open stream: Connection refused. I've tried cURL with a USERAGENT set, with no result either. It's a simple weather service that I'm creating; it returns the altitude, wind direction, speed and temperature at a given coordinate on the globe.
Return sample: 30000;221;2;-32;1;
On the other side (the side receiving the request), I have a web server running IIS 7.5, with all router firewalls, computer firewalls and antivirus software disabled just for testing, and it is still refusing the connection ONLY for the HostGator servers. I've tried the same code with other web hosting providers, and it works properly.
This service will handle a lot of requests per minute, so it seems to me that something has blocked the connection between HostGator and my server due to the number of requests. But I don't know where!
The page is perfectly accessible via a browser.
This is my environment on the HostGator side:
allow_url_fopen: On
allow_url_include: On
OpenSSL: Enabled
Here is my PHP code:
$datalink = "http://#####.########.###:8280/weather.php?waypoint_lat=-10.981925&waypoint_lon=-37.077377&altitude=30000";
$weather_layer = file_get_contents($datalink);
echo "Layer ($datalink):" .$weather_layer."<br>";
Isn't HostGator blocking the requests because of DDoS protection? Give them a call; my hosting provider was blocking connections to my other server because they thought it was some hacker DDoSing using my hosting.
Also, there might be a problem with the port in the URL - maybe HostGator cannot process it?
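One way to check the port theory is to open a raw socket to that port from the HostGator side. Here is a minimal sketch, assuming the host and port 8280 from the question (the ##### hostname is kept as a placeholder):
<?php
// Hypothetical test script to run on the HostGator account: can this server
// even reach the remote host on the non-standard port 8280?
$host = '#####.########.###';   // placeholder host from the question
$port = 8280;

$errno  = 0;
$errstr = '';
$socket = @fsockopen($host, $port, $errno, $errstr, 10); // 10-second timeout
if ($socket === false) {
    echo "Connection refused or blocked: [$errno] $errstr\n";
} else {
    echo "Port $port is reachable from this server.\n";
    fclose($socket);
}
?>
If this fails while the same test from another host succeeds, the block is between HostGator and your server, and support will need that evidence anyway.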
I have a php file and all it contains is
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
echo file_get_contents("http://mywebsite.com/javascript-function.php");
?>
And for some reason it displays the following notice:
Notice: file_get_contents(): send of 24 bytes failed with errno=104 Connection reset by peer in /home/sites/mywebsite.com/public_html/index.php on line 6
Notice: file_get_contents(): send of 2 bytes failed with errno=32 Broken pipe in /home/sites/mywebsite.com/public_html/index.php on line 6
I have never come across this message before, so I have no idea how to solve it.
I have also tried using cURL but it outputs nothing and no error message.
A connection reset by peer error occurs on a data-stream connection when either the remote host you are connecting to (i.e. mywebsite.com, which you specified in the call to file_get_contents) terminates the socket connection on its end before the client has finished sending the request, or the local network stack detects a failure while connecting.
Some common root causes are a firewall rule blocking the connection on either end, or possibly a misconfigured web server. One way to narrow down the problem is to request the same URL from a web browser on the same client that ran the script when the error occurred. If it works as expected, at least you know it's not a firewall issue on the client, and you can begin digging into the web server's config files to troubleshoot further. If the same problem occurs in the browser, then you should look into the firewall rules on that client, as well as the host's firewall rules, if any.
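Since the question mentions that cURL "outputs nothing and no error message", it is also worth making cURL report its own error explicitly; a connection reset usually surfaces as a cURL error code even when nothing is printed. A minimal sketch, assuming the same URL as in the question:
<?php
// Sketch: surface the cURL error instead of silently printing nothing.
$ch = curl_init('http://mywebsite.com/javascript-function.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);

$body = curl_exec($ch);
if ($body === false) {
    // curl_error() gives the human-readable message, curl_errno() the numeric code
    echo 'cURL failed: [' . curl_errno($ch) . '] ' . curl_error($ch) . "\n";
} else {
    echo $body;
}
curl_close($ch);
?>
The error code it prints (for example a connect or recv failure) tells you which side dropped the connection.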
I have an ecommerce website that has been running for several months with no code changes (and for several years with only minimal changes to the card processing path). I now have a problem where when first opening a connection to the credit card processor secure server, the connection fails. On a second (or third, or fourth, etc.) attempt the connection succeeds. After some length of time--perhaps 5 minutes--the initial connection will fail again and subsequent connections will succeed.
Sample code that comes from the credit card processor's PHP API file:
$url = 'https://esplus.moneris.com:443/gateway_us/servlet/MpgRequestArray';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $dataToSend);
curl_setopt($ch, CURLOPT_TIMEOUT, $gArray[CLIENT_TIMEOUT]);
curl_setopt($ch, CURLOPT_USERAGENT, $gArray[API_VERSION]);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, TRUE);
$response = curl_exec($ch);
if (!$response) {
    print curl_error($ch);
    print "\n";
    print curl_errno($ch);
    print "\n";
} else {
    print "Success\n";
}
Output:
% php tester_curl.php
error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
35
% php tester_curl.php
Success
% php tester_curl.php
Success
% php tester_curl.php
Success
There are some similar questions, but I haven't been able to resolve the problem, and I don't see any with the same error message plus the symptom of subsequent connection attempts succeeding after an initial failure, e.g.:
Unable to establish SSL connection, how do I fix my SSL cert?
curl errno 35 (Unknown SSL protocol error in connection to [secure site]:443) (same error message)
How to fix cURL SSL connection timeout that only happens the first time the script is called (different error msg, but SSL connections fails first attempt, subsequently succeeds)
The server is somewhat broken. It supports TLS 1.2 and TLS 1.0, but not TLS 1.1 (it replies with TLS 1.0, which is OK). This is usually not a problem unless you have client code that tries to enforce specific protocols by excluding others.
The behavior you describe looks like a client which downgrades the connection after a failed attempt, keeps the downgrade cached for a while, but retries the originally failed version again after some time. To track the problem down:
check if the problem is also with other servers
check if other clients have the problem with the same server
check the underlying implementation. Curl can use GnuTLS, NSS, OpenSSL and maybe more. From the error message it looks like OpenSSL, but which version?
check for any middlebox (firewall, load balancer...) in the path to the server which might cause problems
do a packet capture and post it here in a form usable with wireshark (e.g. cloudshark)
For more information on how to debug this kind of problem, and which additional information would be useful, check http://noxxi.de/howto/ssl-debugging.html#aid_external_debugging
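If the client really is downgrading onto a protocol version the server mishandles, one possible workaround (a sketch, not a confirmed fix for this particular gateway) is to pin cURL to a version the server is known to support, such as TLS 1.2. The CURL_SSLVERSION_TLSv1_2 constant requires PHP 5.5+ with cURL 7.34+ built against a TLS 1.2-capable OpenSSL:
<?php
// Sketch: force TLS 1.2 so the handshake never falls back onto the broken TLS 1.1 path.
$ch = curl_init('https://esplus.moneris.com:443/gateway_us/servlet/MpgRequestArray');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_2);

if (curl_exec($ch) === false) {
    echo 'SSL error: [' . curl_errno($ch) . '] ' . curl_error($ch) . "\n";
} else {
    echo "Handshake succeeded\n";
}
curl_close($ch);
?>
If the pinned version connects reliably while the default negotiation fails intermittently, that points strongly at the downgrade behaviour described above.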
I had the exact same thing happen with a client using an old VirtualMerchant gateway. It started failing at 5:00 PM on a Monday and magically started working again at 10 AM the next day.
Whether on the command line via openssl or curl, or through cURL in PHP, the connection would fail the first time, and then if you ran the same command a second later it would work.
I tried forcing IPv4 (instead of IPv6), setting timeouts, forcing different protocols, downgrading OpenSSL, etc., and none of it worked.
Our assumption is that this was something DNS- and/or server-related on the gateway side, because nothing we did fixed it and it fixed itself.
We were running an older OpenSSL that only supported up to TLS 1.1, but it had been working and then it started working again, so it wasn't only our client. That said, the age of our client must have been part of the issue, because other, newer clients didn't experience the "first attempt failure" during the same window of time.
Long story short: if this happens, it's probably not YOU (aside from having an older OpenSSL), and the gateway/server you're calling will likely need to fix or tweak something on its end for it to start working again.
Keep in mind that OpenSSL is part of the Linux core packages, so you can't simply upgrade it without serious risk of messing up your server; you'd have to upgrade to a newer version of the operating system to get a more modern OpenSSL.
I'm trying to write a Joomla module which will parse JSON data from the Springer API. I have a problem with file_get_contents and the other replacements I've tried. My problem is this:
Warning: file_get_contents(http://www.example.com) [function.file-get-contents]: failed to open stream: A socket operation was attempted to an unreachable network. in C:\wamp\www\modules\mod_springer\mod_springer.php on line 72
After some searching, I found that it may be because of my company's firewall. Is there any way to overcome this problem, such as changing the port I'm using or using another method, or am I stuck here?
Note: allow_url_fopen is enabled. I'm using wamp.
You're not stuck if you can convince the powers that be to allow you access through the firewall to the remote API you wish to connect to. As long as you have a legitimate reason and the firewall access can be provisioned with a narrow scope (one specific IP and port), I don't see why you should have a problem getting this access.
Download the cacert.pem file from here
Copy the cacert.pem file to, for example, the c:/wamp/bin/php/extras/ssl folder
Write or uncomment in php.ini: curl.cainfo = "c:/wamp/bin/php/extras/ssl/cacert.pem" and save
Restart the Wamp/Xampp server
DONE
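If you can't (or would rather not) edit php.ini globally, the same certificate bundle can be pointed to per request. A sketch, assuming the same path used above and a placeholder URL:
<?php
// Sketch: point a single cURL request at the downloaded CA bundle
// instead of setting curl.cainfo globally in php.ini.
$ch = curl_init('https://www.example.com/');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_CAINFO, 'c:/wamp/bin/php/extras/ssl/cacert.pem');

$result = curl_exec($ch);
if ($result === false) {
    echo curl_error($ch);
}
curl_close($ch);
?>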
It may be possible (MAYBE) to overcome the issue with cURL's proxy handling, like this:
curl_setopt($ch, CURLOPT_PROXY, "http://xxx.xxx.xxx.xxx:8080");
curl_setopt($ch, CURLOPT_PROXYPORT, 8080);
curl_setopt($ch, CURLOPT_PROXYUSERPWD, "xxx:xxx");
This would depend on a few things, like the permissions you have, and whether it is just a blocked port or full access control.
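Putting those proxy options into a complete request might look like the sketch below; the URL, proxy address, port and credentials are all placeholders you would have to replace with values from your network/IT team:
<?php
// Sketch: fetch the remote API through a corporate proxy.
// The URL and all proxy values (xxx.xxx.xxx.xxx, 8080, user:pass) are placeholders.
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, 'http://xxx.xxx.xxx.xxx:8080');
curl_setopt($ch, CURLOPT_PROXYPORT, 8080);
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'user:pass');

$json = curl_exec($ch);
if ($json === false) {
    echo 'Request failed: ' . curl_error($ch);
} else {
    $data = json_decode($json, true);   // parse the JSON response
    var_dump($data);
}
curl_close($ch);
?>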
I have a script with this line:
$id = strtolower(implode('',file($ip_service . $ip)));
When executed, the call to file() will look like file('http://www.url.com/filename.php?119.160.120.38').
On Server A this works fine, but on Server B it gives the following error:
file(http://www.url.com/filename.php?xxx.xxx.xxx.xxx) [function.file]: failed to open stream: Connection timed out in /home/path/filename.php on line 22
Line 22 is the above line of code.
Server A has PHP 4.4.6, Server B has 4.4.8
Any help will be highly appreciated.
Using file() and/or file_get_contents() to access files across the internet is fairly error-prone. For example, I don't think they follow redirects. The timeout period is also very short, not the typical network timeout. It's also difficult to capture errors to see why a call failed.
I always use cURL for accessing files over the network. It takes a few more lines of code, but is much more reliable. Note that PHP cURL support may not be installed in your setup.
$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_URL, 'http://www.url.com/filename.php?119.160.120.38');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$result = curl_exec($ch);
if ( $result==false ) {
echo curl_errno($ch).' '.curl_error($ch);
}
curl_close($ch);
Obviously you would want to do something besides echoing out the error message if there was an error. The $result variable will contain the contents of the file if the request succeeds.
Your server couldn't establish a TCP/IP connection to the server hosting www.url.com within the given time period (20 seconds? 30 seconds? Whatever the default is, or whatever you specified as the timeout). The other server didn't even reject the connection actively; there just wasn't any response at all. It could be, for example, a firewall issue where some or all of your packets, or the other server's, were silently dropped.
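If you suspect a slow connection rather than a dead one, you can also lengthen the socket timeout that file() uses before giving up. A minimal sketch, reusing the URL from the question:
<?php
// Sketch: raise default_socket_timeout (usually 60 seconds by default) before
// concluding the remote host is unreachable from Server B.
ini_set('default_socket_timeout', 120);

$lines = @file('http://www.url.com/filename.php?119.160.120.38');
if ($lines === false) {
    echo "Still timing out - most likely a network/firewall issue between the servers, not PHP.\n";
} else {
    $id = strtolower(implode('', $lines));
    echo $id;
}
?>
If it still fails with a generous timeout, that confirms the connection is being dropped somewhere on the network path rather than just being slow.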