OK, I have been struggling with this for a couple of days now.
I have a Joomla installation on a local machine on our network for our intranet, with JomSocial also installed.
The problem is that when I go to site configuration, edit an event, or navigate to any Joomla module that calls an external API, I get either
CURL error : 7 Failed to connect to maps.google.com port 80: Connection timed out
or
Connection timed out (110)
The issue is definitely not Joomla or JomSocial, as I have other PHP applications running on the same server that also can't contact external APIs.
The server setup is:
Ubuntu 14.04 LTS
PHP 5.5
Apache 2.4.7
MariaDB
The server sits behind a proxy, but has full internet access from the CLI. All the necessary PHP extensions are enabled. I have set the global proxy variable in /etc/environment and in the apt config, and I have set the proxy variable in Joomla. My Joomla and component updates work fine, but no curl or fsockopen functions are working.
I have no idea where else to look for the error. My thinking is that the www-data user might not have sufficient privileges to execute fsockopen and curl from a browser.
Any advice?
UPDATE: I have tested the site on another machine which is not on the corporate network (directly connected to the internet) and everything works. So I am pretty certain that my issue is with my machine and permissions on the network, specifically my www-data user. How can I fix this?
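A minimal test script along these lines (the file name and the 5-second timeout are arbitrary) helps separate the two suspects: whether mod_php sees the proxy variables at all, and whether outbound connections are blocked for www-data. Run it once from the CLI and once through Apache, then compare:
<?php
// If getenv() differs between the CLI run and the browser run, the problem
// is the Apache environment, not the www-data user's privileges.
var_dump(getenv('http_proxy'));
var_dump(getenv('https_proxy'));

$ch = curl_init('http://maps.google.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // fail fast instead of waiting for error 110
$page = curl_exec($ch);
echo $page === false ? 'cURL error: ' . curl_error($ch) : 'cURL OK';
curl_close($ch);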
It appears that the http_proxy variable is not used by PHP (mod_php) even if PassEnv is used to pass it, or if it is set directly with SetEnv. Confusingly, it is still displayed correctly when getenv('http_proxy') is called in a PHP script; getenv() also reads Apache's per-request environment, while libcurl only consults the real process environment.
However, there are two ways to get it working:
Set it in the Apache envvars (/etc/apache2/envvars) as follows:
export http_proxy=http://proxy.example.com:8080/
and restart Apache.
Alternatively, put this in the PHP files that load the application (e.g. index.php, bootstrap.php, etc.):
putenv('http_proxy=http://proxy.example.com:8080/');
Again, if you test with getenv('http_proxy') you will see that it is set correctly.
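If neither approach is an option, a third way is to take the environment out of the picture and hand the proxy to cURL yourself. A minimal sketch (the update URL is from the test code below, the proxy address is a placeholder):
$proxy = getenv('http_proxy') ?: 'http://proxy.example.com:8080/'; // placeholder fallback
$ch = curl_init('http://update.joomla.org/core/list.xml');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, $proxy); // explicit, independent of the process environment
$list = curl_exec($ch);
curl_close($ch);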
I've just had the same problem with a pretty close setup (the only differences being MySQL instead of MariaDB, and Joomla 3.4.1) and it took me quite a while to get everything together, so I will put the list of possible stumbling blocks here:
Make sure php5-curl is installed. Joomla can use a proxy only with cURL as the transport layer.
sudo apt-get install php5-curl
I found no use in entering the proxy in the Joomla configuration. The only good it did was that the update connection would not time out but return immediately.
It is not enough to place the environment variables in /etc/apache2/envvars; you also need to use "PassEnv" in /etc/apache2/apache2.conf, i.e. (taken from https://stackoverflow.com/a/21571588/1967646):
PassEnv http_proxy
Also, I needed to pass both HTTP_PROXY and HTTPS_PROXY, as the XML lists were fetched via HTTP and files later on via HTTPS (probably update files from GitHub). Possibly you need to have these variables in lower case, but on the Joomla "PHP Information" configuration page similarly named variables show up in upper case.
I don't know whether this really made any difference, but restarting apache2 as follows seems to be the right way (instead of apache2ctl):
sudo service apache2 restart
I put together some haphazard code for testing whether cURL and PHP would work together; most of it comes from https://stackoverflow.com/a/1477710/1967646, I only added plenty of error reporting. Put it in a file test.php in the web folder's root directory and look at it with your favorite browser.
<?php
ini_set('display_errors', 'On');
error_reporting(E_ALL);

$url = 'http://update.joomla.org/core/list.xml';

function get_page($url, $proxy = true) {
    if ($url != '') {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        if ($proxy) {
            curl_setopt($ch, CURLOPT_PROXY, '<enter your proxy host here>');
            curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
            curl_setopt($ch, CURLOPT_PROXYPORT, <enter your proxy port here>);
        }
        if (!$html = curl_exec($ch)) {
            echo '<br>Last cURL error is ' . curl_error($ch) . '<br>';
        } else {
            echo '<br>cURL without error.<br>';
        }
        curl_close($ch);
        return $html;
    } else {
        echo 'Empty URL.';
    }
}

echo 'Hello, getting pages via curl:';
$html = get_page($url);           // first try: through the proxy
var_dump($html);
echo bin2hex($html);
echo '<br>';
var_dump(get_page($url, false));  // second try: direct, without the proxy
echo '<br>done.<br>';
?>
Use this:
export http_proxy=http://your.proxy.server:port/
or pass the proxy to curl directly. From man curl:
-x, --proxy <[protocol://][user:password@]proxyhost[:port]>
Use the specified HTTP proxy.
If the port number is not specified, it is assumed at port 1080.
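For reference, a rough PHP equivalent of curl's -x option; the proxy host and credentials below are placeholders:
$ch = curl_init('http://www.google.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, 'http://proxy.example.com:8080'); // protocol://host:port
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'user:password');          // only if the proxy requires auth
$page = curl_exec($ch);
curl_close($ch);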
I work remotely and can access internal servers via VPN. When my VPN connection is established, I can reach my webserver via curl:
curl http://sub.mydomain.com
I can also reach the webserver in a browser by going to http://sub.mydomain.com. So this does not seem to be a DNS issue with the webserver itself.
When developing my Laravel 4.2 application (PHP 5.6) served locally via Apache, however, PHP's curl_exec fails to resolve the host. Oddly, PHP's gethostbyname($hostname) correctly resolves the host. I have tried forcing IPv4, as I have read IPv6 can result in failures of this type, but with no success.
// works
$ip = gethostbyname($hostname);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://$ip/path");
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
curl_exec($ch);
// does NOT work
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://$hostname/path");
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
curl_exec($ch);
This leaves me at a loss. I don't understand how PHP curl handles DNS resolution (clearly not with gethostbyname). I also don't understand exactly how DNS lookups on private networks work in the first place. So I really don't know where to look to get PHP curl to resolve my private hosts.
Command line curl resolved the host. Browser resolved the host. Only PHP curl failed to resolve it.
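As a stopgap while debugging, libcurl can also be told to skip its resolver for a specific host via CURLOPT_RESOLVE (libcurl 7.21.3+, exposed in PHP 5.5+). A sketch reusing the IP that gethostbyname() already resolves correctly:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://$hostname/path");
// Pin the hostname to the address gethostbyname() found, bypassing
// libcurl's resolver for this handle (entry format: "host:port:address").
curl_setopt($ch, CURLOPT_RESOLVE, array("$hostname:80:" . gethostbyname($hostname)));
curl_exec($ch);
curl_close($ch);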
Ultimately, the issue came down to the curl configuration. I installed PHP with Homebrew, and as a dependency it installed curl-openssl to be used by PHP. This install of curl is configured by the brew formula to use c-ares for domain name resolution. I don't know how c-ares works, but this VPN DNS setup is apparently an edge case it does not handle correctly on OS X (perhaps due to OS X doing a lousy job of keeping /etc/resolv.conf up to date).
/usr/bin/curl on the other hand was configured to use the native OS X resolver. This is the same resolver used by PHP's gethostbyname and the web browser, which explains why both of those work as expected.
$ brew uninstall --ignore-dependencies curl-openssl
This resolved my issue by dumping this "broken" curl installation. I am not sure how the fallback mechanism works, but I believe PHP is now using /usr/bin/curl since I have no other installations of curl (that I know about) and the curl version listed in phpinfo() now matches /usr/bin/curl when before it did not.
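For anyone wanting to confirm which libcurl their PHP is actually linked against, curl_version() reports it; compare its output with what /usr/bin/curl --version prints:
$info = curl_version();
echo $info['version'] . ' / SSL: ' . $info['ssl_version'] . "\n"; // should match the CLI curl if PHP picked it up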
I have this simple PHP script:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://www.instagram.com/zuck/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
echo htmlspecialchars($output);
I have Apache 2.4.17 and PHP 5.6.16 (I also tried with PHP 7).
I have tried running it on a remote host server and it works just as expected. However, it doesn't work at all on my PC.
I tried WAMP and XAMPP, disabled the firewall, connected directly to my modem (without the router), and checked php.ini, where the cURL extension is uncommented. I also tried downloading a fix from http://www.anindya.com/, which didn't work either. When I call curl_version() it works (so I guess cURL is loaded), but this script doesn't. And the strange thing is there are no errors, just a blank page.
I really don't have any more troubleshooting ideas
After some testing I found the problem:
First I checked for errors in the script itself with:
echo curl_error($ch);
which returned this:
SSL certificate problem: unable to get local issuer certificate
It turns out I had to disable SSL certificate verification, because all the websites I had tried used SSL (e.g. Instagram, Google, etc.):
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
This question is marked as answered, but for those who see this in the future, the comment of @amphetamachine is important. Setting CURLOPT_SSL_VERIFYPEER to false is not a good idea. It will work on your local server, but do you really want this on the remote server?
Rather than having to remember to comment out this line for production (or make it conditional on the environment), I suggest you add the absolute path to the Certificate Authority file to your php.ini file (where you will have already uncommented the cURL extension):
curl.cainfo = "your absolute local path\cacert.pem"
This file can be downloaded from the cURL website if you don't have it.
That way your local test system will work and you will not compromise your production setup.
You could set the path in the cURL option CURLOPT_CAPATH but again you would not want this in your production code.
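If you can't edit php.ini at all (e.g. on shared hosting), the same CA bundle can be supplied per request via CURLOPT_CAINFO; the path below is a placeholder:
$ch = curl_init('https://www.instagram.com/zuck/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CAINFO, '/path/to/cacert.pem'); // keep verification on, just tell cURL where the CA bundle is
$output = curl_exec($ch);
curl_close($ch);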
I have a DigitalOcean LEMP application droplet configured to host my website (a PHP/CodeIgniter application). I'm using the Nexmo SMS API in my application and everything worked well until today. Suddenly, when I try to send an SMS I get the error "Could not resolve host: rest.nexmo.com". It looks like cURL is not working, but when I checked whether it is installed, it is, and my phpinfo() shows it as well. I restarted nginx and php5-fpm and tried some different curl settings in my code, but I always get this error.
When I try to run a simple script like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com/');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'read_header');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'read_body');
curl_exec($ch);
if ($error = curl_error($ch)) {
    echo "Error: $error<br />\n";
}

function read_header($ch, $string)
{
    $length = strlen($string);
    echo "Received $length bytes<br />\n";
    return $length;
}

// read_body was referenced above but missing from the snippet; both callbacks
// must return the number of bytes they handled, or cURL aborts the transfer.
function read_body($ch, $string)
{
    return strlen($string);
}
the result is still "Error: Could not resolve host: www.google.com", so I think the problem is in cURL and not in my application code (I'm using the Nexmo CodeIgniter library).
I have tried everything that came to mind and now I'm running out of ideas, so any help is appreciated. Are there any special settings/things to do to make cURL work with nginx that I'm missing? Maybe something in the nginx.conf files, or do I need to open some ports, etc.?
Note that the error is "Could not resolve host". This points not to curl but to the system resolver library. curl does not, by itself, do DNS lookups; it uses the system's standard methods, usually libresolv. If you use a system-call trace utility like strace, you will see that the resolver is then controlled by /etc/nsswitch.conf, /etc/host.conf and /etc/resolv.conf. Your first port of call should be there. You can test that it is the system and not curl by using a standard PHP file call, like:
$web_content = file_get_contents("http://www.google.com/");
This should return the body if it can resolve the host.
I send an item code to a web service in XML format using cURL (PHP). I get the correct response on localhost, but when I do it on the server it shows
cURL Error (7): couldn't connect to host
And here's my code:
function xml_post($post_xml, $url)
{
    $user_agent = $_SERVER['HTTP_USER_AGENT'];
    $ch = curl_init(); // initialize curl handle
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post_xml);
    curl_setopt($ch, CURLOPT_USERAGENT, $user_agent);
    // curl_setopt($ch, CURLOPT_PORT, $port);
    $data = curl_exec($ch);
    $curl_errno = curl_errno($ch);
    $curl_error = curl_error($ch);
    if ($curl_errno > 0) {
        echo "cURL Error ($curl_errno): $curl_error\n";
    } else {
        echo "Data received\n";
    }
    curl_close($ch);
    echo $data;
}
I send the item code to Tally and fetch the details from it. I tried both PHP 4+ and PHP 5+, and nothing works out. Any solution?
cURL error code 7 (CURLE_COULDNT_CONNECT) is very explicit: it means "Failed to connect() to host or proxy."
The following code would work on any system:
$ch = curl_init("http://google.com"); // initialize curl handle
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$data = curl_exec($ch);
print($data);
If you cannot see the Google page, then your URL is wrong or you have some firewall or restriction issue.
The "CURL ERROR 7 Failed to connect to Permission denied" error is caused when, for any reason, the cURL request is blocked by a firewall or something similar.
You will face this issue whenever the cURL request does not use a standard port. For example, if you curl to some URL which is on port 1234, you will face this issue, whereas a URL on port 80 will give you results easily.
Most commonly this error has been seen on CentOS and any other OS with SELinux. You need to either disable SELinux or change it to permissive.
Have a look at this one:
http://www.akashif.co.uk/php/curl-error-7-failed-to-connect-to-permission-denied
Hope this helps.
If you have tried all the ways and failed, try this one command:
setsebool -P httpd_can_network_connect on
In PHP, if your network is behind a proxy, you should set the proxy URL and port:
curl_setopt($ch, CURLOPT_PROXY, "http://url.com"); // your proxy URL
curl_setopt($ch, CURLOPT_PROXYPORT, "80"); // your proxy port number
This solved my problem.
In my case I had something like "cURL Error (7): ... Operation Timed Out". I'm using the network connection of the company I'm working for, so I needed to create some environment variables. The following worked for me:
In a Linux terminal:
$ export https_proxy=yourProxy:80
$ export http_proxy=yourProxy:80
On Windows I created the same environment variables in the Windows way.
I hope it helps!
Regards!
Are you able to hit that URL from a browser or a PHP script? The error shown means it could not connect, so first confirm that the URL is accessible.
Check whether ports 80 and 443 are blocked, or look up the IP of graph.facebook.com and enter it in your /etc/hosts file.
You can also get this if you are trying to hit the same URL with multiple HTTP requests at the same time: many of the cURL requests won't be able to connect and so return this error.
This issue can also be caused by making cURL calls over HTTPS when it is not configured on the remote device. Calling over HTTP can resolve the problem in these situations, at least until you configure SSL on the remote end.
In my case, the problem was caused by the hosting provider I was using blocking http packets addressed to their IP block that originated from within their IP block. Un-frickin-believable!!!
For a couple of days I was totally blocked on this. I'm very, very new to networking/VMs but was keen to try to set it up myself instead of paying a hosting company to do it for me.
Context
I'm rebuilding the server side for an app that uses PHP routines to return various bits of data from internal sources as well as external APIs for a map-based app. I started an Oracle VM instance and installed/set up Apache and PHP. Everything ran totally fine until one of my PHP routines tried to execute a cURL request. I started implementing error logging, only to find that I didn't even get a message, just '7', despite an implementation very similar to the above. My PHP routine accessing an internal file for data ran successfully, so I was fairly sure it wasn't an Apache or PHP issue. I also checked my Apache error logs; nothing telling.
Solution
I nearly gave up. There's talk of disabling SELinux above and in other articles; I tried that and it did work for my purposes, but here's a really good article on why you shouldn't disable SELinux: https://www.electronicdesign.com/technologies/embedded-revolution/article/21807408/dont-do-it-disabling-selinux
If temporarily disabling it works (which at least confirms that SELinux is blocking you!) and, like me, you don't want to leave it disabled, I found a neat little command that actually prints out any SELinux issues in a more readable fashion:
sealert -a /var/log/audit/audit.log
This returned the following:
found 1 alerts in /var/log/audit/audit.log
--------------------------------------------------------------------------------
SELinux is preventing php-fpm from name_connect access on the tcp_socket port 443.
Great, I now get a bit more information than just '7'. Reading further down, I can see it actually makes suggestions:
***** Plugin catchall_boolean (24.7 confidence) suggests ******************
If you want to allow httpd to can network connect
Then you must tell SELinux about this by enabling the 'httpd_can_network_connect' boolean.
Do
setsebool -P httpd_can_network_connect 1
This has been mentioned further above, but now I have a bit more context and an explanation as to what it does. I ran the command, and I was in business. Furthermore, my SELinux is still set to enforcing, meaning my machine is more secure.
There are many other suggestions in the log; if you're blocked, it might be worth running sealert over /var/log/audit/audit.log and checking its suggestions.
I was using cURL to scrape content from a site, and just recently my page started hanging when it reached curl_exec($ch). After some tests I noticed that it could load any other page from my own domain, but when attempting to load anything external I get a "connect() timeout!" error.
Here's a simplified version of what I was using:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'http://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
$contents = curl_exec ($ch);
curl_close ($ch);
echo $contents;
?>
Here's some info I have about my host from my phpinfo():
PHP Version 5.3.1
cURL support enabled
cURL Information 7.19.7
Host i686-pc-linux-gnu
I don't have access to SSH or to modifying the php.ini file (however, I can read it). But is there a way to tell whether something was recently set to block cURL access to external domains? Or is there something else I might have missed?
Thanks,
Dave
I'm not aware of any setting like that; it would not make much sense.
As you said you are on a remote webserver without console access, I guess that your activity has been detected by the host, or more likely it caused issues, and so they firewalled you.
A silent iptables DROP would cause this.
When scraping Google you need to use proxies for anything more than a handful of requests, and you should never abuse your webserver's primary IP if it's not your own. That's likely a breach of their TOS and could even result in legal action if they get banned from Google (which can happen).
Take a look at Google rank checker; that's a PHP script that does exactly what you want, using cURL and proper IP management.
I can't think of anything that would cause a timeout other than a firewall on your side.
I'm not sure why you're getting a "connect() timeout!" error, but note the following line:
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
If it's not set to 1, curl_exec() will print the page directly and will not return any of its content into your $contents.
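For completeness, here is the same snippet with that flag flipped, so curl_exec() hands the page back into $contents:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body from curl_exec() instead of printing it
$contents = curl_exec($ch);
curl_close($ch);
echo $contents;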