I'm importing data from external domains successfully with cURL, but it fails with this URI: http://www.airbnb.com/calendar/ical/760186.ics?s=29623a93eb0e693c77591a711f082f06, which is an iCalendar (.ics) file.
I can successfully fetch it from the command line (try it yourself):
shell>> curl https://www.airbnb.com/calendar/ical/760660.ics?s=593cc556438a8f0919beb6107b6f508d
so it's not a network issue.
But my PHP script (which does return other URIs) does not return this one; rather, it returns false.
Here is the small PHP function:
function file_get_contents_curl($url) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, FALSE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
$data = curl_exec($ch);
curl_close($ch);
return $data;
}
$ical1= "http://www.airbnb.com/calendar/ical/760186.ics?s=29623a93eb0e693c77591a711f082f06";
echo file_get_contents_curl($ical1);
I do think this has something to do with my Apache or PHP configuration, because it runs on AppFog and it worked with my old XAMPP installation.
To summarize: all URIs worked with the old XAMPP installation, and now only the one from the example fails.
In my phpinfo() I can read:
cURL support enabled
cURL Information 7.24.0
Age 3
Features
AsynchDNS Yes
Debug No
GSS-Negotiate Yes
IDN No
IPv6 Yes
Largefile Yes
NTLM Yes
SPNEGO No
SSL Yes
SSPI Yes
krb4 No
libz Yes
CharConv No
Protocols dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host i386-pc-win32
SSL Version OpenSSL/1.0.1c
ZLib Version 1.2.5
libSSH Version libssh2/1.3.0
Try this:
function get_remoteDATA($url){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
//If POST METHOD IS NEEDED
//curl_setopt($ch, CURLOPT_POST, TRUE);
//curl_setopt($ch, CURLOPT_POSTFIELDS, "var1=1&var2=2&var3=3");
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0)");
//curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1C25 Safari/419.3");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 9);
$data = curl_exec($ch);
curl_close($ch);
return $data;
}
$cntn= get_remoteDATA('http://www.airbnb.com/calendar/ical/760186.ics?s=29623a93eb0e693c77591a711f082f06');
print_r($cntn);
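If the original helper still returns false, reading curl_error() before closing the handle usually tells you why (a common culprit is a CA-certificate failure once the http:// URL redirects to https://). A minimal diagnostic sketch, not a drop-in fix:
function file_get_contents_curl_debug($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $data = curl_exec($ch);
    if ($data === false) {
        // Read the error before curl_close(), otherwise it is lost
        echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch) . "\n";
    }
    curl_close($ch);
    return $data;
}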
Related
I have a PHP 7.4 script that downloads a zip file using cURL. Both servers run:
Apache/2.4.51 (Fedora)
Fedora 35
OpenSSL version 1.1.11
If I use CURL_HTTP_VERSION_1_0, everything works; with CURL_HTTP_VERSION_2_0 it does not. Apache on the server I am calling has the h2 protocol enabled. Below are the pertinent lines of code.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0); // this is where I change to ver 2
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US; rv:1.8.1) Gecko/20061024 BonEcho/2.0");
$html = curl_exec($ch);
The error I get using CURL_HTTP_VERSION_2_0 is: Curl Error: transfer closed with 4 bytes remaining to read
Also, I can successfully curl the server with --http2 from the CLI on the same box the script runs on.
What else should I try? Is there other info I should post to help answer?
EDIT: Is it possible the Content-Length header is being incorrectly set on the sending side?
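One way to test that theory is to compare the advertised Content-Length with the bytes that actually arrive: if they differ under HTTP/2 but match under HTTP/1.0, the sending side is the likely problem. A rough sketch, with $url standing in for your real URL:
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_2_0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$html = curl_exec($ch);
// Length the server announced vs. bytes actually received on this transfer
$advertised = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
$received   = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
echo "advertised: $advertised, received: $received, error: " . curl_error($ch) . "\n";
curl_close($ch);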
I'm trying to retrieve the contents of a URL: https://www.cyber.gov.au/.
If I use wget or curl from the command line, all is fine. The response is almost instant.
$ wget https://www.cyber.gov.au/
--2020-11-17 08:47:12-- https://www.cyber.gov.au/
Resolving www.cyber.gov.au (www.cyber.gov.au)... 92.122.153.122, 92.122.153.201
Connecting to www.cyber.gov.au (www.cyber.gov.au)|92.122.153.122|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 41951 (41K) [text/html]
Saving to: ‘index.html’
index.html 100%[=========================================>] 40.97K --.-KB/s in 0.002s
2020-11-17 08:47:13 (18.8 MB/s) - ‘index.html’ saved [41951/41951]
However, when I try to connect to the same URL through PHP curl, it times out with the message:
Operation timed out after 5001 milliseconds with 0 bytes received
I've reduced this to a test case:
$handle = curl_init('https://www.cyber.gov.au/');
curl_setopt($handle, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($handle, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($handle, CURLOPT_TIMEOUT, 5);
curl_setopt($handle, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36');
$output = curl_exec($handle);
echo $output;
curl_close($handle);
I also tried with various combinations of these additional curl settings, with no change:
curl_setopt($handle, CURLOPT_FRESH_CONNECT, true);
curl_setopt($handle, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4); // Also tried specifying v6
curl_setopt($handle, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($handle, CURLOPT_SSL_VERIFYHOST, 0);
It doesn't seem to be the DNS resolution time:
echo curl_getinfo($handle, CURLINFO_NAMELOOKUP_TIME); // 0.012 seconds
I've tried this on different machines with different versions of PHP (7.2.12 and 7.4.10), and I get the same behaviour. Other URLs, both HTTP and HTTPS, work as expected. I get the same result from CLI PHP as through Apache. Trying file_get_contents() gives a similar result; it just times out. Adding verbose curl logging didn't provide any more information.
curl --version gives curl 7.47.0 and curl 7.58.0 on the machines I tested on.
Can anyone spot what's going on or point me in the right direction to find out more about the problem?
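One thing that may narrow it down: the per-phase timers from curl_getinfo() can be read even after a failed curl_exec(). If the connect time is filled in but the TLS (app connect) time stays at 0, the TCP connection succeeds and the stall is in the TLS handshake. A small sketch along those lines:
$handle = curl_init('https://www.cyber.gov.au/');
curl_setopt($handle, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($handle, CURLOPT_TIMEOUT, 5);
curl_exec($handle);
// Seconds spent in each phase; 0 means the phase was never reached
echo 'dns:     ' . curl_getinfo($handle, CURLINFO_NAMELOOKUP_TIME) . "\n";
echo 'connect: ' . curl_getinfo($handle, CURLINFO_CONNECT_TIME) . "\n";
echo 'tls:     ' . curl_getinfo($handle, CURLINFO_APPCONNECT_TIME) . "\n";
echo 'ttfb:    ' . curl_getinfo($handle, CURLINFO_STARTTRANSFER_TIME) . "\n";
curl_close($handle);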
I'm migrating from PHP 5.3 to PHP 5.6.3.
When doing a curl request, it now gives me an error:
The requested URL returned error: 411 Length Required
All the data is the same. A var_dump() of $msg returns a string of length 620. The length is correct; none of the variables have changed, and they contain the same data whether it is PHP 5.3 or PHP 5.6.3.
So in short: I have exactly the same setup, data, format, etc., but now curl gives a 411 error.
This is a snippet of my code:
$handle = curl_init();
curl_setopt($handle, CURLOPT_URL, $this->_url);
curl_setopt($handle, CURLOPT_FAILONERROR, true);
curl_setopt($handle, CURLOPT_HTTPHEADER, array("Content-Type: text/xml","SOAPAction: ".$request."","Content-length: ".strlen($msg)));
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($handle, CURLOPT_POSTFIELDS, $msg);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($handle, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($handle, CURLOPT_HTTPAUTH, CURLAUTH_NTLM);
curl_setopt($handle, CURLOPT_USERPWD, $this->_httpUser.':'.$this->_httpPassword);
$response = curl_exec($handle);
phpinfo() cURL section:
cURL support enabled
cURL Information 7.39.0
Age 3
Features
AsynchDNS Yes
CharConv No
Debug No
GSS-Negotiate No
IDN Yes
IPv6 Yes
krb4 No
Largefile Yes
libz Yes
NTLM Yes
NTLMWB No
SPNEGO Yes
SSL Yes
SSPI Yes
TLS-SRP No
Protocols dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host i386-pc-win32
SSL Version OpenSSL/1.0.1i
ZLib Version 1.2.7.3
libSSH Version libssh2/1.4.3
OK, I had to remove the Content-length header, since curl sets this automatically for non-PUT requests.
Why my code worked under PHP 5.3 but not with PHP 5.6.3 is unknown to me, yet this is the solution.
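For reference, the change amounts to dropping the Content-length entry from the header array and letting cURL compute it from CURLOPT_POSTFIELDS; everything else in the snippet stays as it was:
curl_setopt($handle, CURLOPT_HTTPHEADER, array(
    "Content-Type: text/xml",
    "SOAPAction: " . $request
));
curl_setopt($handle, CURLOPT_POSTFIELDS, $msg); // cURL now sets Content-Length itself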
I have a script which gets an image over HTTP and it works fine, but when I try to get an image from an HTTPS link it does not work. I am using cURL, and I found this: Get image via https and php, but it is not working.
My code:
$image_url="https://ssl.gstatic.com/accounts/services/mail/phone.png";
$slika = getSslPage($image_url);
if(!empty($slika))
file_put_contents('vest.jpg', $slika);
function getSslPage($url) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_REFERER, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
return $result;
}
phpinfo() gives me this information:
Protocols: dict, file, ftp, gopher, http, imap, pop3, rtsp, smtp, telnet, tftp
and the command line shows https:
root@server [~]# curl --version
curl 7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.16.2.3 Basic ECC zlib/1.2.3 libidn/1.18 libssh2/1.4.2
Protocols: tftp ftp telnet dict ldap ldaps http file https ftps scp sftp
I solved it by installing cURL with SSL:
/scripts/easyapache
option 7 on the menu
select PHP
scroll down and select CURL with SSL
exit
save
Everything is working now.
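As a quick sanity check before and after the rebuild, you can ask the cURL extension itself whether it was built with HTTPS support, instead of reading phpinfo():
$info = curl_version();
// 'protocols' lists everything this libcurl build can speak
if (in_array('https', $info['protocols'], true)) {
    echo "HTTPS is supported by this cURL build\n";
} else {
    echo "HTTPS is NOT supported; rebuild cURL/PHP with SSL\n";
}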
When I attempt to use PHP's cURL methods for SOME URLs, it times out. When I use the commandline for the same URL, it works just fine.
I am using AWS and have a t2.medium box running the php-55 apache libraries from yum.
Here is my PHP code:
function curl($url) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36');
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 2);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
'Accept-Language: en-us'
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
$fh = fopen('/home/ec2-user/curllog', 'w');
curl_setopt($ch, CURLOPT_STDERR, $fh);
$a = curl_exec($ch);
curl_close($ch);
fclose($fh);
$headers = explode("\n",$a);
var_dump($headers);
var_dump($a);
exit; // debugging stop; remove this and the dumps above to use the function normally
return $a;
}
So here is a call that works just fine:
curl('http://www.google.com');
and it returns the data for the Google homepage.
However, when I try another URL:
curl('http://www.trulia.com/profile/agent-1391347/overview');
I get this in the curllog:
[ec2-user@central Node]$ cat ../curllog
* Hostname was NOT found in DNS cache
* Trying 23.0.160.99...
* Connected to www.trulia.com (23.0.160.99) port 80 (#0)
> GET /profile/agent-1391347/overview HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36
Host: www.trulia.com
Accept: */*
Accept-Language: en-us
* Operation timed out after 10002 milliseconds with 0 bytes received
* Closing connection 0
If I run this from the command line:
curl -s www.trulia.com/profile/agent-1391347/overview
It IMMEDIATELY returns (within 1 second) with NO output, which is expected. However, when I run this:
curl -sL www.trulia.com/profile/agent-1391347/overview
It returns the page properly, just as I would want.
So, what is wrong with my curl?
PHP 5.5.20
Here is the cURL bit from my phpinfo():
curl
cURL support => enabled
cURL Information => 7.38.0
Age => 3
Features
AsynchDNS => Yes
CharConv => No
Debug => No
GSS-Negotiate => No
IDN => Yes
IPv6 => Yes
krb4 => No
Largefile => Yes
libz => Yes
NTLM => Yes
NTLMWB => Yes
SPNEGO => Yes
SSL => Yes
SSPI => No
TLS-SRP => No
Protocols => dict, file, ftp, ftps, gopher, http, https, imap, imaps, ldap, ldaps, pop3, pop3s, rtsp, scp, sftp, smtp, smtps, telnet, tftp
Host => x86_64-redhat-linux-gnu
SSL Version => NSS/3.16.2 Basic ECC
ZLib Version => 1.2.7
libSSH Version => libssh2/1.4.2
I have checked your curl() function and it seems fine; there is no need to change anything in it. All you need to do is pass the URL as-is as the parameter; there is no need to change HTTPS to HTTP:
curl('http://www.trulia.com/profile/agent-1391347/overview');
Reason: you have already told curl not to verify the SSL certificate:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
Let me know if you need any explanation.
The verbose output shows a clear timeout problem:
Operation timed out after 10002 milliseconds with 0 bytes received
This signals a problem with your network setup. These are harder to locate: the cause can be on your own end (e.g. in the context of the web server or the PHP executable) or on the other end. Both are possible to a certain extent; however, the server accepts both requests even though they carry different request headers, so it is more likely that this is related to the execution context, which is also how you generally describe it.
Check whether any security or networking layers restrict making those requests from PHP. For example, try a different server image if you're not into system administration and troubleshooting. From what is shared in your question, it is hard to say what exactly causes your timeout.
Try increasing the timeout values in the following lines:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
Those are pretty short timeout values; CURLOPT_TIMEOUT in particular limits the entire execution time. Try larger values:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
You have two timeout options:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
The first one, CURLOPT_CONNECTTIMEOUT, is the maximum amount of time allowed to make a connection to the server.
You can disable it by setting it to 0, that is:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
But this is not a good approach in a production environment, because the connection attempt will never time out.
Now for CURLOPT_TIMEOUT. From the PHP documentation:
The maximum number of seconds to allow cURL functions to execute.
Set it to a higher value:
curl_setopt($ch, CURLOPT_TIMEOUT, 20); // 20 Seconds.
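Putting the two options together with basic error reporting makes it obvious whether the longer limits actually help; a sketch using the URL from the question:
$ch = curl_init('http://www.trulia.com/profile/agent-1391347/overview');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15); // time allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // time allowed for the whole transfer
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    echo 'Fetched ' . strlen($body) . " bytes\n";
}
curl_close($ch);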