Ok, I am having a hard time simply trying to get the contents of a page from a GoDaddy-hosted server onto our company's proprietary server. Originally I was using file_get_contents, then after searching all over SO I realized cURL was the better option for working around security and configuration issues. Here is my code:
function get_content($URL) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_URL, $URL);
    $data = curl_exec($ch);
    if (curl_errno($ch)) {
        echo 'Curl error: ' . curl_error($ch);
    }
    curl_close($ch);
    return $data;
}
echo 'curl:' . get_content('https://xxx-xxxxx:4032/test2.html');
Here is the error:
Curl error: Failed to connect to xxx-xxxxx.com port 4032: Connection refused
Here are some facts:
If I enter the URL into my browser, I can retrieve test2.html
If I execute the EXACT same script on a different web host (Lunar Pages), it works perfectly fine
get_content() will work on google.com
Go Daddy representatives cannot help us
On our server, we've disabled the firewall (while we tested this)
I would have posted this as a comment, but I don't have enough reputation to do that. GoDaddy is one of the worst hosts for custom code. Sure, they're fine for things like WordPress, but if you want custom functionality in your code, they're one of the worst.
This is just one example; GoDaddy blocks most file_get_contents and cURL calls at their firewall. I would go with a host like HostGator or DigitalOcean... both are cheap but nowhere near as limiting.
Before making a switch, I would try to run this same code on another environment locally and make sure you can connect.
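One quick way to separate a cURL or PHP problem from plain network blocking is to try opening a raw TCP connection to the host and port. This is a minimal sketch; the helper name is my own, and you should substitute the actual host and port from the question:

```php
<?php
// Minimal reachability check: if a raw TCP connection fails, the problem is
// the network or a firewall, not cURL or PHP configuration.
function port_reachable($host, $port, $timeout = 5)
{
    $fp = @fsockopen($host, $port, $errno, $errstr, $timeout);
    if ($fp === false) {
        echo "Cannot connect: ($errno) $errstr\n";
        return false;
    }
    fclose($fp);
    return true;
}

// Substitute your real host and port, e.g. port_reachable('xxx-xxxxx.com', 4032)
var_dump(port_reachable('127.0.0.1', 1, 2));
```

If this fails on GoDaddy but succeeds from another host, the host is blocking outbound connections on that port and no cURL option will help.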
Related
I am trying to use the Luminati.io proxy service to crawl URLs, but there seems to be a problem with my server connecting to the proxy and using the CURLOPT_PROXY functionality.
$curl = curl_init('http://lumtest.com/myip.json');
curl_setopt($curl, CURLOPT_PROXY, 'http://example:24000');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1); // return the response instead of printing it
$result = curl_exec($curl); // call curl_exec() only once, or the request is sent twice
echo $result;
echo 'Curl error: ' . curl_error($curl);
curl_close($curl);
I get no response at all from the target URL, and curl_error() returns Curl error: Failed to connect to example port 24000: Connection refused. There are several variations of the cURL setup that Luminati.io provides; none seem to work.
If I remove the CURLOPT_PROXY option and just send the request direct from my server with no proxy, it works just fine and I get the correct response back from the target URL. So my server seems okay with CURL just not the proxy function.
The URL and port for the proxy server at luminati.io seem to work fine when using the desktop proxy manager. So the proxy service seems to work okay, the target URL is good, and my server can use cURL with no problem, so the issue seems isolated to CURLOPT_PROXY not working. The good people over at Luminati.io think there is some type of server setting or firewall on my Apache Linux server that won't allow the proxy connection to occur; I can't find this setting anywhere, and HostGator seems useless and apathetic when asked.
So I'm hoping someone can provide some greater insight into why CURLOPT_PROXY won't work.
You can ask Hostgator to give you access to port 22225. I think this is the correct port used by Luminati.
This is how you can contact Hostgator: http://support.hostgator.com/articles/open-new-ports
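When a proxy connection fails like this, cURL's verbose log usually shows exactly where the connection drops (DNS lookup, TCP connect, or the proxy handshake). A sketch of how to capture it; the proxy address is the placeholder from the question, and the timeouts are my own additions:

```php
<?php
$ch = curl_init('http://lumtest.com/myip.json');
curl_setopt($ch, CURLOPT_PROXY, 'http://example:24000'); // placeholder proxy from the question
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_VERBOSE, true);

// Send the verbose log to a memory stream so it can be inspected afterwards
$log = fopen('php://temp', 'w+');
curl_setopt($ch, CURLOPT_STDERR, $log);

$result = curl_exec($ch);
rewind($log);
echo stream_get_contents($log); // shows the connect/handshake steps and where they fail
curl_close($ch);
```

A "Connection refused" appearing right after the "Trying ..." line in that log points at a firewall or a closed port, rather than anything in the PHP code.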
I have a DigitalOcean LEMP droplet configured to host my website (a PHP/CodeIgniter application). I'm using the Nexmo SMS API in my application and everything worked well until today. Suddenly, when I try to send an SMS, I get the error "Could not resolve host: rest.nexmo.com". It looks like cURL is not working, but when I checked whether it is installed - it is, and my phpinfo() shows it as well. I restarted nginx and php5-fpm, and tried some different curl settings in my code, but I always get this error.
When I tried to run simple script like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com/');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'read_header');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'read_body');
curl_exec($ch);
if ($error = curl_error($ch)) {
    echo "Error: $error<br />\n";
}
function read_header($ch, $string)
{
    $length = strlen($string);
    echo "Received $length bytes<br />\n";
    return $length;
}
// Body callback referenced by CURLOPT_WRITEFUNCTION above
function read_body($ch, $string)
{
    $length = strlen($string);
    echo "Received $length bytes<br />\n";
    return $length;
}
the result is still "Error: Could not resolve host: www.google.com", so I think the problem is in cURL and not in my application code (I'm using the Nexmo CodeIgniter library).
I tried everything that came to mind and now I'm running out of ideas, so any help is appreciated. Are there any special settings or steps needed to make cURL work with nginx that I'm missing? Maybe something in the nginx.conf files, or do I need to open some ports, etc.?
Note that the error is "Could not resolve host". This points not to curl, but to the system resolver library. Curl does not, by itself, do DNS lookups; it uses the system's standard methods, usually via libresolv. If you use a system call trace utility like strace, you will see that the resolver is controlled by /etc/nsswitch.conf, /etc/host.conf and /etc/resolv.conf. Your first port of call should be there. You can verify that it is the system and not curl by using a standard PHP file call, like:
$web_content = file_get_contents("http://www.google.com/");
This should return the body if it can resolve the host.
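You can also test DNS resolution directly from PHP. gethostbyname() returns the hostname unchanged when resolution fails, which makes a quick check easy (the helper name below is my own):

```php
<?php
// gethostbyname() returns the unmodified hostname on failure,
// so comparing input and output detects resolution problems.
function can_resolve($host)
{
    return gethostbyname($host) !== $host;
}

// If this prints false, the system resolver is broken,
// and no cURL option will fix it.
var_dump(can_resolve('www.google.com'));
var_dump(can_resolve('localhost'));
```

If can_resolve() fails while the command-line tools on the same box (dig, nslookup) succeed, compare the environment PHP-FPM runs under with your shell environment, since they can see different resolver configurations, for example inside a chroot.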
I created a small web app, with the feature of being able to Sign in with a Twitter account, by connecting with OAuth. It worked like charm for a few months, but now it stopped working. This is a quick and summarized overview "Sign in with twitter" algorithm.
Collecting some parameters (timestamp, nonce, some kind of application ID, and so on)
Use these parameters to create a new URL that looks like this:
https://api.twitter.com/oauth/request_token?oauth_consumer_key=DLZZTIxpY19FnWJNtqw5A&oauth_nonce=1369452195&oauth_signature=DIetumiKqJu66XXVvDDHdepnP9M%3D&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1369452195&oauth_version=1.0
Connect to that URL and retrieve the data; it contains an access token
Continue doing fun stuff using that access token.
The URL generated in step 2 is fine, because when I manually copy it into Google Chrome it shows a beautiful access token, so the problem isn't there (I think).
In the step 3, I have a really small method that should do some very basic stuff: Connect to the URL generated before, retrieve the content and return it.
On my localhost, using EasyPHP 12.1, it works perfectly as usual, but on the free host I'm using (000webhost) it doesn't work anymore. When trying to connect, it just times out. The HTTP code is 0 and the cURL error is "Couldn't connect to host".
This is the method used to connect to the URL.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
$response = curl_exec($ch);
// curl_getinfo() must be called after curl_exec(), otherwise CURLINFO_HTTP_CODE is always 0
$http_status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
return $response;
And this is an example of a URL used with that method:
https://api.twitter.com/oauth/request_token?oauth_consumer_key=DLZZTIxpY19FnWJNtqw5A&oauth_nonce=1369452195&oauth_signature=DIetumiKqJu66XXVvDDHdepnP9M%3D&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1369452195&oauth_version=1.0
I've been trying to fix it all day long, but now I have no idea even of what to try. The files are the very same in my localhost and in the 000webhost.
If you could enlighten me I would be very happy. I'll take my pants off for answers if it's needed. Thank you very much.
It might be possible that Twitter has blacklisted (blocked) your free host's servers or IP address. This can happen if other users on the server abuse the API.
The only thing I can think of is that your free web hosting service blocks it. These services are fine as long as everything stays very simple; the moment things become complicated, you run into restrictions imposed by the provider. Most of them limit bandwidth, disk space, server usage, support, uploads and more.
I send an item code to a web service in XML format using cURL (PHP). I get the correct response on localhost, but when I do it on the server it shows
cURL Error (7): couldn't connect to host
And here's my code:
function xml_post($post_xml, $url)
{
    $user_agent = $_SERVER['HTTP_USER_AGENT'];
    $ch = curl_init(); // initialize curl handle
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post_xml); // implicitly makes this a POST request
    curl_setopt($ch, CURLOPT_USERAGENT, $user_agent);
    // curl_setopt($ch, CURLOPT_PORT, $port);
    $data = curl_exec($ch);
    $curl_errno = curl_errno($ch);
    $curl_error = curl_error($ch);
    if ($curl_errno > 0) {
        echo "cURL Error ($curl_errno): $curl_error\n";
    } else {
        echo "Data received\n";
    }
    curl_close($ch);
    echo $data;
}
I send the item code to Tally and fetch the details from it. I tried both PHP 4+ and PHP 5+; nothing works. Any solution?
CURL error code 7 (CURLE_COULDNT_CONNECT)
is very explicit: it means cURL failed to connect() to the host or proxy.
The following code would work on any system:
$ch = curl_init("http://google.com"); // initialize curl handle
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$data = curl_exec($ch);
print($data);
If you cannot see the Google page, then either your URL is wrong or you have some firewall or restriction issue.
The "CURL ERROR 7 Failed to connect to Permission denied" error occurs when a curl request is blocked by a firewall or something similar.
You will typically face this issue whenever the curl request is not on a standard port.
For example, if you curl to a URL on port 1234, you will face this issue, whereas a URL on port 80 will give you results easily.
Most commonly this error is seen on CentOS and any other OS with SELinux.
You need to either disable SELinux or set it to permissive.
Have a look at this one:
http://www.akashif.co.uk/php/curl-error-7-failed-to-connect-to-permission-denied
Hope this helps
If you have tried all the ways and failed, try this one command:
setsebool -P httpd_can_network_connect on
In PHP, if your network is behind a proxy, you should set the proxy URL and port:
curl_setopt($ch, CURLOPT_PROXY, "http://url.com"); // your proxy url
curl_setopt($ch, CURLOPT_PROXYPORT, "80"); // your proxy port number
This solved my problem.
In my case I had something like cURL Error (7): ... Operation Timed Out. I'm using the network connection of the company I work for, so I needed to create some environment variables. The following worked for me:
In Linux terminal:
$ export https_proxy=yourProxy:80
$ export http_proxy=yourProxy:80
On Windows I created the same environment variables through the system settings.
I hope it helps!
Regards!
Are you able to hit that URL from a browser or a PHP script? The error says it could not connect, so first confirm that the URL is accessible.
Check whether ports 80 and 443 are blocked, or find the IP of graph.facebook.com and enter it in the /etc/hosts file.
You can also get this error if you hit the same URL with multiple HTTP requests at the same time: many of the curl requests won't be able to connect and so return this error.
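If you genuinely need several requests at once, curl_multi lets a single process drive them concurrently instead of many overlapping scripts each opening their own connections. A minimal sketch; the URLs are placeholders:

```php
<?php
// Drive several cURL handles concurrently with curl_multi.
$urls = ['http://example.com/', 'http://example.org/']; // placeholder URLs
$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Run all transfers until none are still active
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh, 1.0); // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

foreach ($handles as $url => $ch) {
    $body = curl_multi_getcontent($ch); // empty/false when the transfer failed
    echo $url . ': ' . strlen((string) $body) . " bytes\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

Even with curl_multi, keep the concurrency modest; the connection-refused symptom described above is exactly what a rate-limiting firewall on either end produces.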
This issue can also be caused by making curl calls over https when SSL is not configured on the remote device. Calling over http can resolve the problem in these situations, at least until you configure SSL on the remote.
In my case, the problem was caused by the hosting provider I was using blocking http packets addressed to their IP block that originated from within their IP block. Un-frickin-believable!!!
For a couple of days I was totally blocked on this. I'm very very new to networking/vms but was keen to try set it up myself instead of paying a hosting company to do it for me.
Context
I'm rebuilding the server side for an app that uses PHP routines to return various bits of data from internal sources as well as external APIs for a map-based app. I started an Oracle VM instance and installed/set up Apache and PHP. All was running totally fine, until one of my PHP routines tried to execute a cURL request. I implemented error logging, only to find that I don't even get a message - just '7', despite my implementation being very similar to the above. My PHP routine accessing an internal file for data ran successfully, so I was fairly sure it wasn't an Apache or PHP issue. I also checked my Apache error logs; nothing telling.
Solution
I nearly gave up - there's talk of disabling SELinux above and in other articles. I tried that and it did work for my purposes, but here's a really good article on why you shouldn't disable SELinux: https://www.electronicdesign.com/technologies/embedded-revolution/article/21807408/dont-do-it-disabling-selinux
If temporarily disabling it works and, like me, you don't want to leave it disabled (but it does confirm that SELinux is blocking you!), I found a neat little command that prints out any SELinux issues in a more readable fashion:
sealert -a /var/log/audit/audit.log
This returned the following:
found 1 alerts in /var/log/audit/audit.log
--------------------------------------------------------------------------------
SELinux is preventing php-fpm from name_connect access on the tcp_socket port 443.
Great, I now get a bit more information than just '7'. Reading further down, I can see it actually makes suggestions:
***** Plugin catchall_boolean (24.7 confidence) suggests ******************
If you want to allow httpd to can network connect
Then you must tell SELinux about this by enabling the 'httpd_can_network_connect' boolean.
Do
setsebool -P httpd_can_network_connect 1
This has been mentioned further above but now I have a bit more context and an explanation as to what it does. I run the command, and I'm in business. Furthermore, my SELinux is still set to enforcing, meaning my machine is more secure.
There are many other suggestions logged out, if you're blocked it might be worth logging out/checking out /var/log/audit/audit.log.
I was using cURL to scrape content from a site, and just recently my page started hanging when it reached curl_exec($ch). After some tests I noticed that it could load any other page from my own domain, but when attempting to load anything external I get a connect() timeout! error.
Here's a simplified version of what I was using:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'http://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
$contents = curl_exec ($ch);
curl_close ($ch);
echo $contents;
?>
Here's some info I have about my host from my phpinfo():
PHP Version 5.3.1
cURL support enabled
cURL Information 7.19.7
Host i686-pc-linux-gnu
I don't have access to SSH or modifying the php.ini file (however I can read it). But is there a way to tell if something was recently set to block cURL access to external domains? Or is there something else I might have missed?
Thanks,
Dave
I'm not aware of any setting like that; it would not make much sense.
As you said you are on a remote webserver without console access, I guess that your activity has been detected by the host, or more likely it caused issues, and so they firewalled you.
A silent iptables DROP would cause exactly this.
When scraping Google you need to use proxies for more than a handful of requests, and you should never abuse your webserver's primary IP if it's not your own. That's likely a breach of their TOS and could even result in legal action if they get banned by Google (which can happen).
Take a look at Google Rank Checker; that's a PHP script that does exactly what you want using cURL and proper IP management.
I can't think of anything other than a firewall on your side that would cause a timeout.
I'm not sure why you're getting a connect() timeout! error, but note the following line:
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
With RETURNTRANSFER set to 0, curl_exec() prints the page directly and returns TRUE, so $contents will never hold the page's content. Set it to 1 if you want the content returned into $contents.
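The difference is easy to demonstrate without any network at all, since cURL can also fetch file:// URLs. A small sketch using a temporary file:

```php
<?php
// Demonstrate CURLOPT_RETURNTRANSFER using a local file:// URL (no network needed).
$tmp = tempnam(sys_get_temp_dir(), 'curl');
file_put_contents($tmp, 'hello world');

// With RETURNTRANSFER = 1, curl_exec() returns the body as a string.
$ch = curl_init('file://' . $tmp);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$returned = curl_exec($ch);
curl_close($ch);
var_dump($returned); // string(11) "hello world"

// With RETURNTRANSFER = 0 (the default), curl_exec() prints the body
// and returns TRUE, so the variable does not hold the content.
$ch = curl_init('file://' . $tmp);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
ob_start();                    // capture what curl_exec() prints
$printed = curl_exec($ch);
$output  = ob_get_clean();
curl_close($ch);
var_dump($printed); // bool(true)
var_dump($output);  // string(11) "hello world"

unlink($tmp);
```

This is why the original snippet appears to "work" when run directly in a browser (the page prints) but $contents never contains anything useful.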