I am managing a beta website where a handful of trial users (3 out of 20) cannot see the content on certain pages. Each user is on a different computer and a different network. I've confirmed that all 3 users have JavaScript enabled, but that does not solve the issue.
Digging deeper, the missing content appears to be due to a failing cURL call: for these users, curl_exec() is either failing or returning no data, which produces a null result, which is why they see no info. For these 3 users the problem occurs across 3 separate browsers (Chrome, IE, Firefox). I have verified that it is not due to any user settings, such as user ID, since I can log into their accounts from multiple computers on different networks and get a response.
I suspect the culprit is the way their machines/networks handle SSL certificates. I am not well versed in SSL certificates, so please excuse my ignorance or glaring mistakes in some areas. However, process of elimination leads me to believe that the website's certificate is being rejected by these 3 users' networks/machines, causing the cURL call to fail. My questions are: how can I verify that this is the case, what would be the cause if so, and how can I resolve the problem?
The website is built with PHP using the Zend framework.
Here is the relevant bit of code:
$ssl_cert = ZendGA_Global::getOption('curl_ssl_certificate');
$ssl_verify_peer = ZendGA_Global::getOption('curl_ssl_verify_peer');
$ssl_verify_host = ZendGA_Global::getOption('curl_ssl_verify_host');
//open connection
$ch = curl_init();
// set the URL and return the response as a string
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// related to SSL certificate verification
// (note: $ssl_verify_peer is fetched above but not actually used here)
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, $ssl_verify_host);
curl_setopt($ch, CURLOPT_CAINFO, $ssl_cert);
// execute the request
$result = curl_exec($ch);
//close connection
curl_close($ch);
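To help diagnose this, I am considering replacing the plain curl_exec() call with an error-logging version. A minimal sketch; 35, 51, and 60 are the standard cURL error codes that point at SSL problems:
$result = curl_exec($ch);
if ($result === false) {
    // 35 = SSL connect error, 51 = peer certificate/fingerprint not OK,
    // 60 = peer certificate cannot be authenticated with known CA certs
    $errno = curl_errno($ch);
    error_log('cURL failed for ' . $url . ': [' . $errno . '] ' . curl_error($ch));
}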
Any help would be greatly appreciated.
EDIT: I've figured out the issue. The cURL call was failing, but only because the URL was incorrect, a problem that originated elsewhere. The URL is built from a base URL plus some parameters, but the function that grabs the base URL was grabbing the hostname instead of the domain name. Some users log in via VPN, which caused the hostname to read slightly differently, which is why the URL was wrong. OK, that was rather convoluted, but the issue has been resolved. :-)
Related
I have the same code running on multiple sites/servers. Two days ago the code started returning http_code = 0 and the error message "empty reply from server" on one of the servers.
Can anyone shed any light as to why a particular server would be working one day, then not working the next? I have submitted a ticket to the ISP explaining the issue but they cannot seem to find what is wrong (yet).
I guess the question really is, what would/could change on a server to stop this from working?
What is interesting, though, is that the URL I am referencing doesn't get touched on the server returning the error. If I change the URL to point to something that doesn't exist, the same error is returned. So it appears that cURL POST requests in general are being rejected by the server. I currently have other cURL scripts hitting these problem sites that still work, but they do not have POST options in them.
The issue is definitely related to cURL POST requests on this server, and they are being rejected pretty much immediately.
On the server in question I have 15+ separate accounts, and every one of them returns the same result, so I don't think it's anything I have changed; I know I haven't made any wholesale changes to ALL the sites at the time this issue arose. Of the 6 other sites I have hosted elsewhere, everything still works fine with exactly the same code.
I have tried various combinations of and changes to options from posts I have read, but nothing has really made a difference: the working sites still work and the non-working sites still don't.
function sendWSRequest($url, $xml) {
    $headers = array();
    // $headers[] = 'Content-Type: application/xml; charset=utf-8';
    $headers[] = 'Content-Type: text/xml; charset=utf-8';
    $headers[] = 'Content-Length: ' . strlen($xml);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_HEADER, true);
    // curl_setopt($ch, CURLINFO_HEADER_OUT, false);
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    // curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    // curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 20);
    $result = curl_exec($ch);
    if ($result === false) {
        print 'error with curl - ' . curl_error($ch) . '<br />';
    }
    $info = curl_getinfo($ch); // kept for debugging; not currently used
    curl_close($ch);
    return $result;
}
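For completeness, the function is invoked like this (the endpoint and payload below are placeholders, not the real ones):
// Hypothetical call: $xml carries the web-service request payload.
$xml = '<request><action>ping</action></request>';
$response = sendWSRequest('https://api.example.com/ws', $xml);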
Any help would be greatly appreciated.
EDIT
To summarise based on further investigation: when the script errors, nothing registers in the server access logs. So it appears that cURL requests containing POST options are being rejected before access is granted/logged...
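For anyone retracing the investigation, this is roughly how the failing call can be inspected (a sketch; php://temp is just a scratch buffer for cURL's verbose transcript):
// Capture cURL's verbose output so the connect/request phases are visible.
$verbose = fopen('php://temp', 'w+');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $verbose);
$result = curl_exec($ch);
rewind($verbose);
echo '<pre>' . htmlspecialchars(stream_get_contents($verbose)) . '</pre>';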
Cheers
Greg J
I know this is an old thread, but I found a solution that may save someone else a headache:
I just began encountering this exact problem with a website hosted at GoDaddy that had been working until recently. To investigate the problem, I created an HTML page with a form containing the same fields submitted in the POST data via cURL.
The browser-submitted HTML form worked while the cURL POST resulted in the Empty reply from server error. So I examined the difference between the headers submitted by the browser and those submitted by cURL using the PHP apache_request_headers() function on my development system where both the cURL and browser submissions worked.
As soon as I added the "User-Agent" header submitted by my browser to the cURL POST, the problem site worked as expected instead of returning an empty reply:
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; rv:31.0) Gecko/20100101 Firefox/31.0'
));
I did not experiment with other/simpler User-Agent headers since this quick fix solved my problem.
According to the PHP manual, upload should be urlencoded:
CURLOPT_POSTFIELDS: The full data to post in a HTTP "POST" operation.
[...] This parameter can either be passed as a urlencoded string like
'para1=val1&para2=val2&...' or as an array with the field name as key and
field data as value. If value is an array, the Content-Type header will be
set to multipart/form-data. As of PHP 5.2.0, value must be an array if files
are passed to this option with the @ prefix. As of PHP 5.5.0, the @ prefix
is deprecated and files can be sent using CURLFile.
So you might try with
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'xml=' . urlencode($xml));
and see what happens. Or, in any case, start with an empty or very simple field to see whether it at least arrives at the destination server.
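Something like this, with a throwaway field name purely as a probe:
// Minimal probe: if even this tiny urlencoded body never registers in the
// target server's access logs, the problem is upstream of the payload.
curl_setopt($ch, CURLOPT_POSTFIELDS, 'test=1');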
Update
I've checked this setup on a test machine and it works. The problem, then, is likely not on the PHP or cURL side at all. Can you request a list of software/hardware updates made to that machine and network in the last few days?
Otherwise, I'd try to capture outgoing traffic so as to determine whether the request leaves the server (and the problem is in between, e.g. a misconfigured firewall: hence my inclusion of "hardware" in the change list), or doesn't leave the server at all. In this latter case the culprits could be:
updates to cURL library
updates to PHP cURL module and/or PHP binaries
updates to "software" firewall rules
updates to ancillary network libraries (unlikely; they should be HTTP agnostic and not differentiate a POST from, say, a GET or HEAD)
OK, as it turns out, a rather reluctant host recompiled Apache2 and PHP, which resolved the issue.
The host claims (in their opening statement to my support ticket) that no updates to either Apache2 or PHP had been performed around the time the issue occurred.
The behavior was such that the server wasn't even acknowledging a cURL request that contained POST options; the target URL was never reached.
Thank you so much to all who provided their advice, particularly Isemi, who went to great lengths to find a resolution.
I am making a web development service which I am trying out at school.
The idea is to be able to run my CMS with a student's content.
I have made version 2 of the as-yet-unnamed CMS, which allows the user to make a skin compatible with the CMS and easily use it. This part works fine.
The problem is, in a presentation I said "We can collect site statistics for your website and notify you if there are any issues with your site," and people seemed interested. What I need is a way to access information from the databases of all my clients' sites from my administration domain, which is hosted on another server. I could use file_get_contents() on my admin site to retrieve a file on the client's site that echoes the data, but that seems sloppy.
Is there any other way, bearing in mind that my host does not allow database access from one site to another?
Thanks, especially for being able to read my overly long sentences!
Hope this helps:
function file_get_data($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // Set curl to return the data instead of printing it to the browser.
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
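For example (the URL and JSON layout below are hypothetical), the admin site could consume a small stats script exposed on each client site:
// Hypothetical usage: the client site exposes stats.php, which prints
// JSON; the admin site fetches and decodes it.
$json = file_get_data('http://client-site.example.com/stats.php');
$stats = json_decode($json, true);
if ($stats !== null) {
    echo 'Visits today: ' . $stats['visits_today'];
}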
We're having problems with an API we are using.
Here is the code we're using (naming no names on the API front):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://apiurl.com/whatever/api/we/call');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$ch_output = curl_exec($ch);
curl_close($ch);
This response times out, but only after a long wait. That hideously slows down our web app, and further code then breaks because of the bad return value. The broken code I can fix; the response timeout I don't know how to fix. Is there any way to quickly check whether a URL is responding (something like ping in a terminal) before attempting a cURL request?
Thank you.
Do you mean using curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, NUMERIC_TIMEOUT_VALUE); to set the timeout?
Your best option would be to set the cURL timeouts to a more acceptable level. There are several timeout options available for DNS lookup, connect timeout, transfer timeout, etc. More information is available here: http://php.net/manual/en/function.curl-setopt.php
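A minimal sketch, using the URL from the question; the timeout values are placeholders to tune for your app:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://apiurl.com/whatever/api/we/call');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Fail fast if the TCP connection cannot be established.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
// Cap the total time allowed for the whole request, transfer included.
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$ch_output = curl_exec($ch);
if ($ch_output === false) {
    // e.g. error 28 (operation timed out) when the API is unresponsive
    error_log('API call failed: ' . curl_error($ch));
}
curl_close($ch);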
I'm encountering a problem to which I can't find a solution anywhere. Even worse, no one else seems to have this problem, so I'm probably doing something very stupid.
Some background info: I'm trying to make a proxy-like page that forwards an AJAX request to a different server, to circumvent the same-origin policy. All I want this code to do is take the POST variables, forward them to a different page, and then return the results. It works except for one thing: every request waits for the timeout before continuing. I've set the timeout to 1 second for now, so it's tolerable, but I'd rather have a fast response and a proper timeout.
Here's my code:
// create a new cURL resource
$call = curl_init();
// set URL and other appropriate options
curl_setopt($call, CURLOPT_URL, $url);
curl_setopt($call, CURLOPT_POST, true);
curl_setopt($call, CURLOPT_POSTFIELDS, $params);
curl_setopt($call, CURLOPT_HEADER, false);
curl_setopt($call, CURLOPT_RETURNTRANSFER, true);
curl_setopt($call, CURLOPT_CONNECTTIMEOUT, 1);
// execute the request and capture the response
$response = curl_exec($call);
// close cURL resource, and free up system resources
curl_close($call);
echo $response;
I've tried sending a "Connection: close" header with it, and several ways to make the target code signal that it's done running (setting Content-Length, flushing, die(), etc.). At this point I really don't know what's going on; what surprises me most is that I can't find anyone with a similar problem.
Who can help me?
This would make sense if the server weren't actually completing the request. This would be expected in a page streaming or service streaming scenario. Are you sure that the server is actually returning a full and complete HTTP response to each request?
Sounds like it's trying to connect, timing out, and the retry is working.
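One way to test that theory is with cURL's own timing info (a diagnostic sketch; $call is the handle from the question, and it must run before curl_close):
$response = curl_exec($call);
$info = curl_getinfo($call);
// If connect_time accounts for almost all of total_time, the delay is in
// establishing the connection rather than in the transfer itself.
printf("DNS: %.3fs, connect: %.3fs, total: %.3fs\n",
    $info['namelookup_time'], $info['connect_time'], $info['total_time']);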
This fixed it for me:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
I can connect on the command line via IPv6, so I don't know why this helps.
What is the best way to check whether a given URL points to a valid file (i.e. does not return a 404/301/etc.)? I've got a script that will load certain .js files on a page, but I need a way to verify that each URL it receives points to a valid file.
I'm still poking around the PHP manual to see which file functions (if any) will actually work with remote URLs. I'll edit my post as I find more details, but if anyone has already been down this path feel free to chime in.
file_get_contents() overshoots the purpose here, since the HTTP headers alone are enough to make the decision, so use cURL to fetch just the headers:
<?php
// create a new cURL resource
$ch = curl_init();
// set the URL and ask for headers only (a HEAD request, no body)
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_NOBODY, 1);
// execute the request; the response headers are echoed to the output
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
?>
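To actually make the decision from the status code rather than eyeballing the headers, one could suppress the output and inspect CURLINFO_HTTP_CODE (a sketch, treating only a 200 as valid):
<?php
// Sketch: true when the URL answers a HEAD request with HTTP 200.
function url_is_valid($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, 1);         // HEAD request, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // don't print the headers
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code === 200;
}
?>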
One such way would be to request the URL and check for a 200 status code in the response. Aside from that, there's really no good way, because the server can handle the request however it likes (including returning other status codes for files that exist but that you don't have access to, for any number of reasons).
If your server doesn't have fopen wrappers enabled (any server with decent security won't), then you'll have to use the cURL functions.