On Mac OS 10.9.1 with PHP 5.4.17 and curl 7.30.0, this curl request runs fine at the command line:
curl -u test:test http://localhost/protected/
But this PHP script using the curl library fails:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/protected/');
curl_setopt($ch, CURLOPT_HTTPAUTH, 'CURLAUTH_BASIC');
curl_setopt($ch, CURLOPT_USERPWD, 'test:test');
curl_setopt($ch, CURLOPT_VERBOSE, TRUE);
echo curl_exec($ch);
curl_close($ch);
The output is:
$ php -e ./test.php
* Adding handle: conn: 0x7fe1b303de00
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x7fe1b303de00) send_pipe: 1, recv_pipe: 0
* About to connect() to localhost port 80 (#0)
* Trying ::1...
* Connected to localhost (::1) port 80 (#0)
> GET /protected/ HTTP/1.1
Host: localhost
Accept: */*
< HTTP/1.1 401 Authorization Required
< Date: Sat, 25 Jan 2014 03:12:40 GMT
* Server Apache/2.2.24 (Unix) DAV/2 PHP/5.4.17 mod_ssl/2.2.24 OpenSSL/0.9.8y is not blacklisted
< Server: Apache/2.2.24 (Unix) DAV/2 PHP/5.4.17 mod_ssl/2.2.24 OpenSSL/0.9.8y
< WWW-Authenticate: Basic realm="Restricted Files"
< Content-Length: 401
< Content-Type: text/html; charset=iso-8859-1
<
[...]
Note that the "Authorization: Basic ..." line is missing from the request header. It works fine if I manually set a request header like this:
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Basic ' . base64_encode('test:test')));
An older system running Mac OS 10.7.5 with PHP 5.4.11 and curl 7.21.4 correctly sends the Authorization header. I tried many different combinations of PHP (5.4.11, 5.4.17, 5.4.24, 5.5.8) and curl (7.30.0, 7.30.4), but on Mac OS 10.9.1, they all failed to send the Authorization header unless I set it manually. Why?
This is wrong:
curl_setopt($ch, CURLOPT_HTTPAUTH, 'CURLAUTH_BASIC');
                                   ^--            ^--
With the quotes, you're setting a string as the option value. But cURL uses define()'d constants, which are NOT quoted.
Try
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
so you're using the actual CURL constant, not some random string that happens to LOOK like a constant.
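For reference, here is a minimal sketch of the corrected request, reusing the test:test credentials and local URL from the question (CURLAUTH_BASIC is just an integer constant defined by the curl extension):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/protected/');
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC); // the constant, not a quoted string
curl_setopt($ch, CURLOPT_USERPWD, 'test:test');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
$body = curl_exec($ch);
curl_close($ch);
echo $body;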
Related
Initially I was having trouble figuring out why PHP cURL behaves differently under the browser than when I execute the same script from the CLI.
By turning on CURLOPT_VERBOSE with log output and comparing the results of the CLI and browser runs, here are the differences I've seen:
CURL Under CLI
* About to connect() to proxy localhost port 3128 (#4)
* Trying ::1...
* Connection refused
* Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 3128 (#4)
* Establish HTTP proxy tunnel to someurl.com:443
* Server auth using Basic with user 'some_username'
> CONNECT someurl.com:443 HTTP/1.1
Host: someurl.com:443
Proxy-Connection: Keep-Alive
< HTTP/1.1 407 Proxy Authentication Required
< Mime-Version: 1.0
< Date: Fri, 11 Dec 2020 12:04:46 CST
< Via: 1.1 someotherurl.com:8080 (Cisco-WSA/12.0.1-334)
< Content-Type: text/html
< Connection: close
< Proxy-Connection: close
< Content-Length: 2109
< X-RBT-SCAR: 2.3.4.5:11517381:2000
< Proxy-Authenticate: Basic realm="Cntlm for parent"
* Authentication problem. Ignoring this.
<
* Received HTTP code 407 from proxy after CONNECT
* Connection #4 to host localhost left intact
CURL Under Browser
* About to connect() to someurl.com port 443 (#6)
* Trying 1.2.3.4...
* Connected to someurl.com (1.2.3.4) port 443 (#6)
* warning: ignoring value of ssl.verifyhost
* skipping SSL peer certificate verification
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
* Server certificate:
* subject: C=US,ST=FL,L=Boca Raton,O=Telit IoT Platforms,OU=secureWISE,CN=someurl.com
* start date: Apr 15 21:18:15 2020 GMT
* expire date: May 15 21:18:15 2022 GMT
* common name: someurl.com
* issuer: E=support#securewise.net,CN=secureWISE CA-256,OU=SecureWISE Certificate Authority,O=ILS Technology LLC,O=Telit Wireless Solutions Inc,L=Boca Raton,ST=Florida,C=US
* Server auth using Basic with user 'some_username'
> GET /someurl HTTP/1.1
Authorization: Basic SomeAuthKey
Host: someurl.com
Accept: */*
< HTTP/1.1 200 OK
< Date: Fri, 11 Dec 2020 04:07:40 GMT
< Server: Apache-Coyote/1.1
< X-Powered-By: Undertow/1
< Set-Cookie: JSESSIONID=c2BBPwZBjGxCaH5om6unoKaI; path=/
< Set-Cookie: somekey=somevalue; path=/
< Content-Type: text/xml
< Content-Length: 125291
< Content-disposition: attachment; filename=somefilename.xml
< Vary: Accept-Encoding,User-Agent
< SWOrigin: sw_proxy
< Connection: close
<
* Closing connection 6
My initial hunch is that this has something to do with the proxy (this PC does use a proxy to go online).
Looking at the browser log, it seems as if the proxy was skipped.
I've also checked phpinfo() for both the browser and the CLI, and I can see that proxy, http_proxy, and https_proxy are defined in the environment variables, as well as under $_SERVER for the CLI, but not in the browser, which makes me believe even more that my assumption is correct.
So in order to combat this, I've tried adding the following code before the curl call:
if(isset($_SERVER['http_proxy']))
unset($_SERVER['http_proxy']);
if (isset($_SERVER['https_proxy']))
unset($_SERVER['https_proxy']);
if (isset($_SERVER['proxy']))
unset($_SERVER['proxy']);
if(isset($_ENV['http_proxy']))
unset($_ENV['http_proxy']);
if (isset($_ENV['https_proxy']))
unset($_ENV['https_proxy']);
if (isset($_ENV['proxy']))
unset($_ENV['proxy']);
curl_setopt($ch, CURLOPT_URL, $target_url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, "someuser:somepass");
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_VERBOSE, true);
$result = curl_exec($ch);
curl_close($ch);
But the verbose output still shows that it tries to go through the proxy when executed under the CLI.
Any suggestions on this?
After digging around, it turns out all I had to do was bypass someurl.com in /etc/cntlm.conf by including the URL in the NoProxy config.
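If editing the cntlm config isn't an option, here is a sketch of an alternative that should also work: libcurl picks up http_proxy/https_proxy from the real process environment via getenv(), so clearing $_SERVER or $_ENV in PHP has no effect on it, but setting CURLOPT_PROXY to an empty string explicitly disables the proxy for that handle (reusing $target_url and the credentials from the code above):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $target_url);
curl_setopt($ch, CURLOPT_PROXY, '');             // empty string = no proxy, even if env vars are set
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, "someuser:somepass");
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_VERBOSE, true);
$result = curl_exec($ch);
curl_close($ch);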
I apologize if the title is inappropriate; I could not think of a better way to describe the problem.
I am going nuts over this problem. I have been collecting feeds and data via cURL for the past 5+ years and have never encountered this kind of situation. I have a large JSON response to collect over the GET method from a remote server via HTTPS, from an address that looks something like this:
https://private.example.com/thisDotNetEndPoint?token=bla-bla-trutj&someParam=1
someParam is changeable; for some values with a lower amount of data everything works fine, at almost identical speeds to the browser, but in several cases cURL always runs into the timeout I set, while in the browser and from the console everything works fine.
PHP
My cURL code is as follows:
$ch = curl_init();
$url = 'https://private.example.com/thisDotNetEndPoint?token=bla-bla-trutj&someParam=1';
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
// I've added this user agent as it is the same as the one Chrome uses
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.116 Safari/537.36');
curl_setopt($ch, CURLOPT_POST, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
// I have tried removing the SSL part below, but no difference
curl_setopt($ch, CURLOPT_SSL_CIPHER_LIST, "HIGH:!SSLv3s");
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // tried this with true, but no difference
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 1200); // whatever timeout I set, cURL always hits it
curl_setopt($ch, CURLOPT_VERBOSE, true);
$response = curl_exec($ch);
if (curl_errno($ch)) {
print("cURL error: " . curl_error($ch));
print_r(curl_getinfo($ch));
} else {
print_r(json_decode($response));
}
curl_close($ch);
This is the verbose output:
* Hostname was NOT found in DNS cache
* Trying 12.34.567.89...
* Connected to private.example.com (12.34.567.89) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: none
CApath: /etc/ssl/certs
* SSL connection using ECDHE-RSA-AES256-SHA384
* Server certificate:
* subject: OU=Domain Control Validated; CN=*.example.com
* start date: 2016-03-03 09:41:38 GMT
* expire date: 2018-03-04 09:52:18 GMT
* subjectAltName: private.example.com matched
* issuer: C=US; ST=Arizona; L=Scottsdale; O=Starfield Technologies, Inc.; OU=http://certs.starfieldtech.com/repository/; CN=Starfield Secure Certificate Authority - G2
* SSL certificate verify ok.
> GET /thisDotNetEndPoint?token=bla-bla-trutj&someParam=1 HTTP/1.1
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.116 Safari/537.36
Host: private.example.com
Accept: */*
* Operation timed out after 1200001 milliseconds with 0 bytes received
* Closing connection 0
It always hits the timeout, whatever timeout I set; I even tried setting it to 2 hours.
I've even tried adding these but no difference:
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1);
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 1200);
Browser
When I enter the same URL in the browser, the response comes back in 6-9 minutes.
cURL from console
I have used the simplest command and it completes in the same time as the browser:
$ curl -X GET -v 'https://private.example.com/thisDotNetEndPoint?token=bla-bla-trutj&someParam=1'
Verbose output:
* Hostname was NOT found in DNS cache
* Trying 12.34.567.89...
* Connected to private.example.com (12.34.567.89) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: none
CApath: /etc/ssl/certs
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS handshake, Server key exchange (12):
* SSLv3, TLS handshake, Server finished (14):
* SSLv3, TLS handshake, Client key exchange (16):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSL connection using ECDHE-RSA-AES256-SHA384
* Server certificate:
* subject: OU=Domain Control Validated; CN=*.example.com
* start date: 2016-03-03 09:41:38 GMT
* expire date: 2018-03-04 09:52:18 GMT
* subjectAltName: private.example.com matched
* issuer: C=US; ST=Arizona; L=Scottsdale; O=Starfield Technologies, Inc.; OU=http://certs.starfieldtech.com/repository/; CN=Starfield Secure Certificate Authority - G2
* SSL certificate verify ok.
> GET /thisDotNetEndPoint?token=bla-bla-trutj&someParam=1 HTTP/1.1
> User-Agent: curl/7.35.0
> Host: private.example.com
> Accept: */*
>
< HTTP/1.1 200 OK
< Cache-Control: private
< Content-Type: application/json; charset=utf-8
< Server: Microsoft-IIS/8.5
< X-StackifyID: V1|b8b10c35-2649-4f67-ba6a-b5ad15ef553b|C56050|CD18|
< Set-Cookie: .ASPXANONYMOUS=looI88UVBp6Cg5tLkzVejO4CNRilhyKjMY4hFqhuO48vdVT19U8h5oisC9khFv1rOmH6Ii_lEec-9XhipEvh1UkewhufqfmlTGFsyQCaML06NVa-5-Vr_OikZb07R6pdHCeRtn9liBVJfamJmXiElA2; expires=Thu, 02-Feb-2017 20:54:18 GMT; path=/; HttpOnly
< X-AspNetMvc-Version: 5.2
< Rx-CID: ae9907d6fc394b24b6599e74ab5a668f
< Rx_RequestId: f3fff82c4de04bba90b2bbc5704ac787
< X-Powered-By: ASP.NET
< Strict-Transport-Security: max-age=31536000
< Access-Control-Allow-Origin: *
< Access-Control-Allow-Headers: rx-cid
< Date: Fri, 25 Nov 2016 10:25:00 GMT
< Content-Length: 2231472
<
[and the response is printed here]
Any ideas?
Thanks in advance.
Did you notice the difference between the console output and your PHP verbose output? The user agent is missing in your PHP code; the curl command line adds this user agent by default, whereas PHP cURL doesn't.
User-Agent: curl/7.35.0
Use the option CURLOPT_USERAGENT.
curl_setopt($ch, CURLOPT_USERAGENT, "Opera 11.0");
Here is my code:
$url='http://celebcrust.com/?p=15055';
$ch = curl_init();
curl_setopt($ch, CURLOPT_COOKIESESSION, TRUE);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, TRUE);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, FALSE);
$httpData = curl_exec($ch);
var_export($httpData);
This code is also available as an interactive demo on phpdiffle.org.
Why is it still redirecting? I'm trying to get the redirected-to URL. I set FOLLOWLOCATION to FALSE, but it still happens.
Okay, here is how I debug these things quickly (it doesn't always work, but as a first attempt to get some traction it normally does):
Requirements: curl for the command line (probably available for every computer system on earth; visit the homepage if you don't have it yet).
-i lists the response headers as well (use -I for a HEAD request if too much data comes back), and -v is for verbose output (it shows what goes where):
$ curl -iv 'http://celebcrust.com/?p=15055'
* Adding handle: conn: 0xa50260
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0xa50260) send_pipe: 1, recv_pipe: 0
* About to connect() to celebcrust.com port 80 (#0)
* Trying 70.32.78.224...
* Connected to celebcrust.com (70.32.78.224) port 80 (#0)
> GET /?p=15055 HTTP/1.1
> User-Agent: curl/7.30.0
> Host: celebcrust.com
> Accept: */*
>
< HTTP/1.1 200 OK
HTTP/1.1 200 OK
< Date: Sat, 31 Aug 2013 14:29:54 GMT
Date: Sat, 31 Aug 2013 14:29:54 GMT
* Server Apache is not blacklisted
< Server: Apache
Server: Apache
< X-Pingback: http://celebcrust.com/xmlrpc.php
X-Pingback: http://celebcrust.com/xmlrpc.php
< X-Powered-By: PleskLin
X-Powered-By: PleskLin
< Content-Length: 159
Content-Length: 159
< Connection: close
Connection: close
< Content-Type: text/html; charset=UTF-8
Content-Type: text/html; charset=UTF-8
<
<META HTTP-EQUIV=Refresh CONTENT="0; URL=http://www.celebgossip.com/2013/04/willie-nelson-celebrates-80th-birthday-stoned-and-auditioning-for-gandalf-39425/">
* Closing connection 0
As this shows, the server does not send a Location: header, which explains why you don't see one.
Instead it sends HTML in the response body, which a hypertext client (web browser) parses for a Refresh http-equiv meta tag.
That is not the business of curl. You need to add an HTML parser and check for these; I suggest DOMDocument with its ->loadHTML() method.
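A minimal sketch of that approach, assuming the page always signals the redirect through a meta refresh tag like the one above:
$ch = curl_init('http://celebcrust.com/?p=15055');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);

$redirectUrl = null;
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings about sloppy markup

foreach ($doc->getElementsByTagName('meta') as $meta) {
    if (strcasecmp($meta->getAttribute('http-equiv'), 'refresh') === 0) {
        // The content attribute looks like: 0; URL=http://www.example.com/target
        if (preg_match('/url\s*=\s*(\S+)/i', $meta->getAttribute('content'), $m)) {
            $redirectUrl = trim($m[1], " \t\"'");
        }
    }
}

var_dump($redirectUrl);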
I have spotted a "weird" PHP cURL behavior that is driving me nuts. Basically what I am doing is making a digest-authenticated call with cURL. Here's an extract of my code:
curl_setopt($this->c, CURLOPT_HTTPAUTH, CURLAUTH_DIGEST);
curl_setopt($this->c, CURLOPT_USERPWD, $username . ":" . $password);
It works fine, and the server actually comes back with a "YES, YOU PROVIDED THE RIGHT CREDENTIALS" kind of message. The only trouble is that the raw HTTP response is a bit odd, as it in fact includes 2 responses instead of one. Here's what curl_exec($this->c) spits out:
HTTP/1.0 401 Unauthorized
Date: Tue, 23 Oct 2012 08:41:18 GMT
Server: Apache/2.2.20 (Ubuntu)
X-Powered-By: PHP/5.3.6-13ubuntu3.9
WWW-Authenticate: Digest realm="dynamikrest-testing",qop="auth",nonce="5086582e95104",opaque="4b24e95490812b28b3bf139f9fbc9a66"
Vary: Accept-Encoding
Content-Length: 9
Connection: close
Content-Type: text/html
HTTP/1.1 200 OK
Date: Tue, 23 Oct 2012 08:41:18 GMT
Server: Apache/2.2.20 (Ubuntu)
X-Powered-By: PHP/5.3.6-13ubuntu3.9
Vary: Accept-Encoding
Content-Length: 9
Connection: close
Content-Type: text/html
"success"
I don't get why it includes the first response from the server (the one in which it states that it requires authentication).
Can anyone shed some light on the issue? How do I avoid the accumulation of responses?
Cheers
It looks like curl has the same behavior if you use the -I option for headers:
curl -I --digest -u root:somepassword http://localhost/digest-test/
returns:
HTTP/1.1 401 Authorization Required
Date: Fri, 31 May 2013 13:48:35 GMT
Server: Apache/2.2.22 (Ubuntu)
WWW-Authenticate: Digest realm="Test Page", nonce="9RUL3wPeBAA=52ef6531dcdd1de61f239ed6dd234a3288d81701", algorithm=MD5, domain="/digest-test/ http://localhost", qop="auth"
Vary: Accept-Encoding
Content-Type: text/html; charset=iso-8859-1
HTTP/1.1 200 OK
Date: Fri, 31 May 2013 13:48:35 GMT
Server: Apache/2.2.22 (Ubuntu)
Authentication-Info: rspauth="4f5f8237e9760f777255f6618c21df4c", cnonce="MTQ3NDk1", nc=00000001, qop=auth
Vary: Accept-Encoding
Content-Type: text/html;charset=UTF-8
X-Pad: avoid browser bug
To get only the second header you could try this (not a very optimal solution):
<?php
$ch = curl_init();
// set url
curl_setopt($ch, CURLOPT_URL, "http://localhost/digest-test/");
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_DIGEST);
curl_setopt($ch, CURLOPT_USERPWD, "root:test");
// first authenticate with a HEAD request
curl_setopt($ch, CURLOPT_NOBODY, 1);
curl_exec($ch);
// then get the real output
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_HTTPGET, 1);
$output = curl_exec($ch);
echo $output;
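Another single-request sketch, assuming the same local digest-protected URL and root:test credentials as above: keep CURLOPT_HEADER on, then split the raw response at CURLINFO_HEADER_SIZE, which counts all header blocks curl received (both the 401 challenge and the final 200), leaving just the body:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost/digest-test/");
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_DIGEST);
curl_setopt($ch, CURLOPT_USERPWD, "root:test");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);

$raw = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($raw, 0, $headerSize); // both the 401 and the 200 header blocks
$body    = substr($raw, $headerSize);    // "success"
echo $body;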
I hit the same problem, and I think it was caused by PHP being compiled against an ancient version of libcurl (7.11.0 in my case, which is now nearly 10 years old). On a different machine with a more recent version of libcurl (7.29.0) the same code was fine, and my problems ended after getting my host to recompile their PHP to use the latest they had available (7.30.0).
This fix was suggested by a thread on the curl-library mailing list from 2008, where a user discovered the problem affected version 7.10.6 but not 7.12.1. I've searched the libcurl changelog around 7.12.0 and failed to find any clear entry about fixing this problem, though it might be covered by "general HTTP authentication improvements". Still, I'm now pretty confident that an old libcurl is the problem.
You can check which version of libcurl is used by your PHP from the 'cURL Information' entry in the output of phpinfo().
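Or from code, a quick sketch using curl_version(), which reports the libcurl release PHP was built against:
$info = curl_version();
echo $info['version'], "\n";     // libcurl release, e.g. 7.29.0
echo $info['ssl_version'], "\n"; // SSL backend it was built with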
--- Update at the bottom, it's related to CURLOPT_COOKIE --
I'm developing on my local machine (192.168.1.103), and I have a PHP script that makes a cURL call to get the header and the content returned by a remote script.
I've installed 2 copies of the remote script that must return its content:
- One on my local machine, under the same virtual host. ( http://192.168.1.103/test/output_script.php )
- One on a remote server. ( http://site.com/text/outputscript.php )
The cURL script works really well when I try to get the content from the remote server, but completely times out when trying to get the content from the local server.
The verbose output of the PHP cURL call is:
* About to connect() to 192.168.1.103 port 80 (#0)
* Trying 192.168.1.103... * connected
* Connected to 192.168.1.103 (192.168.1.103) port 80 (#0)
> GET /app/getContent HTTP/1.1
Host: 192.168.1.103
Accept: */*
Cookie: PHPSESSID=u8spbervheh3tcrv62gcnc2j72
* Operation timed out after 5001 milliseconds with 0 bytes received
* Closing connection #0
Note that the URI is rewritten with the following .htaccess file (in both locations):
RewriteEngine on
RewriteBase /cms/client1/public_html
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [L]
Also note that I've activated the rewrite log and compared requests to make sure that the mod_rewrite action was exactly the same in every situation. (I'm 100% sure it's not a rewrite problem.)
If I try to get the file using the curl command-line tool under Ubuntu, it works well:
$ curl -v --cookie PHPSESSID=u8spbervheh3tcrv62gcnc2j72 http://192.168.1.103/app/getContent
* About to connect() to 192.168.1.103 port 80 (#0)
* Trying 192.168.1.103... connected
* Connected to 192.168.1.103 (192.168.1.103) port 80 (#0)
> GET /app/getContent HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: 192.168.1.103
> Accept: */*
> Cookie: PHPSESSID=u8spbervheh3tcrv62gcnc2j72
>
< HTTP/1.1 403 Forbidden
< Date: Thu, 24 Feb 2011 21:40:17 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Vary: Accept-Encoding
< Content-Length: 82
< Content-Type: text/html; charset=UTF-8
<
* Connection #0 to host 192.168.1.103 left intact
* Closing connection #0
WT_AUTH not defined. (strictly no authentication currently in session)
The 403 error and the WT_AUTH content are what I expect to receive, instead of the timeout that I get with PHP.
It's also the same (wanted & correct) result that I receive if I use PHP cURL against the remote server:
* About to connect() to site.com port 80 (#0)
* Trying 123.123.123.123... * connected
* Connected to site.com (123.123.123.123) port 80 (#0)
> GET /app/getContent HTTP/1.1
Host: site.com
Accept: */*
Cookie: PHPSESSID=u8spbervheh3tcrv62gcnc2j72
< HTTP/1.1 403 Forbidden
< Date: Thu, 24 Feb 2011 21:45:30 GMT
< Server: Apache/2.2.16 (Debian) DAV/2 SVN/1.6.12 mod_fcgid/2.3.6
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Content-Length: 28
< Content-Type: text/html; charset=UTF-8
<
* Connection #0 to host site.com left intact
* Closing connection #0
And I also get the same thing if I access 192.168.1.103/app/getContent directly in my browser.
Finally, I've also made sure that the getContent script was working by adding logging to it. The weird part is that if I start the request at 16:45:00 and the timeout occurs at 16:45:05, the logged data from the getContent script is dated 16:45:05. So it's as if cURL was holding the connection in an "opening" state, and the PHP script is only allowed to start once the connection is closed.
Any idea why it doesn't work locally?
In case you want to take a look at the PHP code, here's the pertinent part:
$ressource = curl_init();
curl_setopt($ressource, CURLOPT_URL, $destinationUrl);
curl_setopt($ressource, CURLOPT_VERBOSE, true);
$handle = fopen(FRAMEWORK_ROOT . DIRECTORY_SEPARATOR . 'log' . DIRECTORY_SEPARATOR . 'curl_debug.txt', 'w');
curl_setopt($ressource, CURLOPT_STDERR, $handle);
// Turn off the server and peer verification (TrustManager Concept).
curl_setopt($ressource, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ressource, CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($ressource, CURLOPT_RETURNTRANSFER, TRUE); // return the content
curl_setopt($ressource, CURLOPT_HEADER, TRUE); // get the HTTP headers
curl_setopt($ressource, CURLOPT_COOKIE, session_name() . '=' . session_id());
curl_setopt($ressource, CURLOPT_TIMEOUT, 5);
echo "\n<br />" . date('Y/m/d H:i:s');
$httpResponse = curl_exec($ressource);
echo "\n<br />" . date('Y/m/d H:i:s');
if(curl_errno($ressource) != 0)
throw new Core_Exc_Def(curl_error($ressource)); // WILL THROW AN ERROR ON 192.168.1.103, BUT NOT ON THE REMOTE SITE.
Funny fact: before adding the TIMEOUT, the request hung forever. The local site wasn't responding, not even on other pages, and I needed to restart the Apache server to be able to access the site again...
Update:
If I comment out the line:
curl_setopt($ressource, CURLOPT_COOKIE, session_name() . '=' . session_id());
It's "working" (it cause another problem, but nothing related to the timeout).
Both script are on the same virtual host, and share the same session, but that should not create a CURL TimeOut ?!
It happens because sessions are locked for writing. When your script connects to the same server with the same session_id, the second script waits until that session lock is released.
You need to change the session_id that you're sending in the request:
Change:
curl_setopt($ressource, CURLOPT_COOKIE, session_name() . '=' . session_id());
To:
curl_setopt($ressource, CURLOPT_COOKIE, session_name() . '=' . md5(session_id() . mktime()));
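An alternative sketch, assuming the called script genuinely needs to share the caller's session: release the session lock in the caller before the sub-request, then reopen the session afterwards.
session_write_close(); // release the lock so the called script can open the same session

$ressource = curl_init();
curl_setopt($ressource, CURLOPT_URL, $destinationUrl);
curl_setopt($ressource, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ressource, CURLOPT_COOKIE, session_name() . '=' . session_id());
curl_setopt($ressource, CURLOPT_TIMEOUT, 5);
$httpResponse = curl_exec($ressource);
curl_close($ressource);

session_start(); // reopen the session if the caller still needs it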