I'm using Google reCaptcha for my webpage.
In testing mode everything works fine. No SSL.
When I test my webpage in the production environment, the following errors occur:
Warning: file_get_contents(): SSL operation failed with code 1. OpenSSL Error messages: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed in /vendor/google/recaptcha/src/ReCaptcha/RequestMethod/Post.php on line 68

Warning: file_get_contents(): Failed to enable crypto in /vendor/google/recaptcha/src/ReCaptcha/RequestMethod/Post.php on line 68

Warning: file_get_contents(https://www.google.com/recaptcha/api/siteverify): failed to open stream: operation failed in /vendor/google/recaptcha/src/ReCaptcha/RequestMethod/Post.php on line 68

["invalid-json"]
I'm calling the reCaptcha API like this:
<script src="https://www.google.com/recaptcha/api.js?onload=onloadCallback&render=explicit"
async defer></script>
as described on Google's developer page.
I'm hosting my webpage at hoststar.ch. TLS 1.2 is running there.
I hope somebody can help me.
In response to your last comment: I realise you cannot change Google's reCaptcha API. What I meant was simply to run file_get_contents against example.com (it does exist) as a test, to see whether you can retrieve any content with that method at all, since some web hosts disable the associated functionality.
However, with respect to the Google reCaptcha API, you might need to pass additional parameters to the file_get_contents call, notably setting the context options specifically for SSL.
$secret  = 'Your google secret';
$captcha = trim( $_POST['g-recaptcha-response'] );
$ip      = $_SERVER['REMOTE_ADDR'];
$url     = "https://www.google.com/recaptcha/api/siteverify?secret={$secret}&response={$captcha}&remoteip={$ip}";

// Verify the peer against a known CA bundle.
$options = array(
    'ssl' => array(
        'cafile'           => '/path/to/cacert.pem',
        'verify_peer'      => true,
        'verify_peer_name' => true,
    ),
);
$context = stream_context_create( $options );

// The second argument is use_include_path (a bool), not FILE_TEXT.
$res = json_decode( file_get_contents( $url, false, $context ) );

if( $res->success ){ /* all good */ }
else{ /* captcha failed */ }
If you don't already have a copy of cacert.pem or ca-bundle.crt, you can download them from their respective links. The cafile option accepts either - save a copy to your host and correct the path to suit your environment.
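If you're not sure where PHP looks for CA certificates on your host, the following check may help (a sketch; openssl_get_cert_locations() requires PHP 5.6+):

// Show where PHP/OpenSSL look for CA certificates by default (PHP 5.6+).
var_dump( openssl_get_cert_locations() );

// Quick sanity check: can this host fetch anything over HTTPS at all?
var_dump( file_get_contents( 'https://example.com' ) !== false );

If the default locations are empty or wrong for your host, the cafile context option above is the way to override them.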
Change file_get_contents to cURL. Here is the code.
Change this:
$verify=file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret={$secret}&response={$response}");
$captcha_success=json_decode($verify); /*store json response*/
To this code:
$ch = curl_init("https://www.google.com/recaptcha/api/siteverify?secret={$secret}&response={$response}");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
$verify = curl_exec($ch);
curl_close($ch);
$captcha_success = json_decode($verify); /* store json response */
Please note that $secret is the secret key stored on the server side and $response is the reCaptcha response sent through POST from the front end.
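For completeness, a variant that sends the parameters as POST fields and checks for transport errors before decoding might look like this (a sketch under the same $secret/$response assumptions, not the verified snippet above):

$ch = curl_init('https://www.google.com/recaptcha/api/siteverify');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'secret'   => $secret,
        'response' => $response,
    )),
));
$verify = curl_exec($ch);
if ($verify === false) {
    error_log('siteverify request failed: ' . curl_error($ch)); // transport-level failure
}
curl_close($ch);
$captcha_success = json_decode($verify);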
I'm trying to fetch data from the Internet using PHP. Being at work, I'm behind a password protected proxy which seems to cause trouble.
I have consulted many posts on Stack Overflow and several other platforms, each giving me a different solution, none of which met my needs.
var_dump(stream_get_wrappers());
$opts = array("http" => array(
    'proxy' => 'user:password@webproxy.xxx.intra:####',
    'request_fulluri' => true,
));
stream_context_set_default($opts);
$homepage = file_get_contents("www.google.com");
echo $homepage;
I've added var_dump(stream_get_wrappers()); as advised in one of the posts I've previously read, returning : [https, ftps, compress.zlib, compress.bzip2, php, file, glob, data, http, ftp, phar, zip]
And the warning I get is the following :
PHP Warning: file_get_contents(www.google.com): failed to open stream: No such file
or directory in /home/user/project/app/connector.php on line 15
Line 15 being $homepage = file_get_contents("www.google.com");
I've been stuck there for way too long, and any help is greatly appreciated.
Thank you.
EDIT: I have added "http://" to the beginning of the address, giving me this error:
PHP Warning: file_get_contents(google.com): failed to open stream:
php_network_getaddresses: getaddrinfo failed: Name or service not
known in /home/user/project/app/connector.php on line 15
For some reason this error got displayed twice, the second one being almost the same with less information.
After searching even more, here is what I have come up with :
$auth = base64_encode('login:password');
$opts = array(
    'http' => array(
        'proxy'           => 'tcp://proxy_host:proxy_port',
        'request_fulluri' => true,
        'header'          => "Proxy-Authorization: Basic $auth",
    ),
);
$context = stream_context_create($opts);
$file = file_get_contents("http://www.google.com", false, $context);
echo $file;
This seems to work fine.
Thanks for your help and advice.
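For anyone who prefers cURL over stream contexts, a rough equivalent would be the following (a sketch; the proxy host, port, and credentials are placeholders):

$ch = curl_init('http://www.google.com');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_PROXY          => 'proxy_host:proxy_port', // placeholder proxy address
    CURLOPT_PROXYUSERPWD   => 'login:password',        // placeholder credentials
    CURLOPT_PROXYAUTH      => CURLAUTH_BASIC,
));
$file = curl_exec($ch);
curl_close($ch);
echo $file;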
I am developing a WordPress plugin for a client to interact with Rightmove's Realtime Data API. I have been sent example PHP code from Rightmove to use. In order to interact with Rightmove's API, you need to have been generated and given a security certificate from Rightmove. I have been sent that and have the password for it. Before executing the curl, the code uses the stat() function on the certificate. Below is the code (with the password crossed out for obvious reasons).
$url = 'https://adfapi.adftest.rightmove.com/';
$ch = curl_init();
$cert = plugins_url("safetech-righmove-realtime-data-feed-realhomes/safetech.pem");
if (!stat($cert)) {
    die('No cert');
}
curl_setopt_array($ch, array(
    CURLOPT_URL            => $url,
    CURLOPT_VERBOSE        => 1,
    CURLOPT_SSL_VERIFYPEER => false,
    CURLOPT_SSLVERSION     => 5,
    CURLOPT_SSLCERT        => $cert,
    CURLOPT_SSLCERTPASSWD  => 'xxxxxxxx'
));
if (!curl_exec($ch)) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
As you can see, if the stat fails, it stops the code and prints "No cert". This is exactly what is happening. I have uploaded the certificate into the plugins folder and can access it by manually going to the URL in the browser, so the file clearly exists.
If I comment out the if statement and just execute the curl, Rightmove returns an authentication error, implying the certificate is not being used. I have tried using the file_exists() function, but that still returns the "No cert" error.
The .pem file has read permissions.
Any idea what is causing the issue? I've been trying various things for hours with no result.
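For what it's worth, the likely cause is that plugins_url() returns an HTTP URL, whereas both stat() and CURLOPT_SSLCERT expect a local filesystem path. A hedged sketch of the fix, assuming the certificate lives in the plugin's own directory (plugin_dir_path() is the path-returning counterpart of plugins_url()):

// plugin_dir_path(__FILE__) yields a filesystem path such as
// /var/www/wp-content/plugins/your-plugin/ rather than a URL.
$cert = plugin_dir_path(__FILE__) . 'safetech.pem';
if (!file_exists($cert)) {
    die('No cert');
}
curl_setopt($ch, CURLOPT_SSLCERT, $cert); // curl reads the certificate from disk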
I am attempting to use @abraham's TwitterOAuth 0.5.3 library for PHP, but when I request a token for the 3-legged authorization, I receive an HTTP 500 as a response.
Here is how I have the code set up in PHP:
<?php
/* Start session and load library. */
session_start();
require_once('config.php');
require_once('twitteroauth/autoload.php');
use Abraham\TwitterOAuth\TwitterOAuth;
/* Build TwitterOAuth object with client credentials. */
$connection = new TwitterOAuth(CONSUMER_KEY, CONSUMER_SECRET);
/* Get temporary credentials. */
// Error occurs on the following line, unable to dump $request_token
$request_token = $connection->oauth('oauth/request_token', array('oauth_callback' => OAUTH_CALLBACK));
//print_r($request_token); // <-- Never reached!!
I know that this problem is not within the Twitter API, as I have verified that I can access my Twitter account via the Dev console.
In addition, I have verified to some degree that the TwitterOAuth library is working, by following the Authorization flow example provided with the library. The example can also access my Twitter account.
I just can't figure out what is going on as I am unable to properly authorize my PHP application to have access to my Twitter account.
What am I doing wrong?
It turns out that a response was never obtained. As a result, attempting to process a response when there was none resulted in errors on the server side.
One of the PHP functions that Twitter OAuth relies upon is curl. I had tested to see if curl_init existed:
print function_exists('curl_init') ? 'curl_init is enabled' : 'curl_init is disabled';
and I erroneously assumed that curl_exec was also enabled. (Why would you leave curl_init enabled, but only disable curl_exec?)
That assumption was incorrect as my web hosting provider has disabled curl_exec "due to security concerns" and I was unaware of this. In addition, my call to use the Twitter API has worked in the past, so this was new behavior.
It took me a while to come back to testing curl_exec. I verified that I was receiving a valid TwitterOauth object and eventually wound my way into the TwitterOauth class and into the request function.
I was receiving no curl error, but the response from curl_exec was null (not TRUE or FALSE as expected). I thought that this was unusual and at first thought that curl was missing a configuration option.
However, it was not.
So, if you run into problems with this library (which has worked great for me in the past), it may be that your hosting provider disabled curl_exec.
You can test this scenario via the following PHP code:
print function_exists('curl_exec') ? 'curl_exec is enabled' : 'curl_exec is disabled';
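To see everything the host has switched off at once, rather than probing functions one by one, you can also inspect the disable_functions directive (a small sketch):

// Lists the functions disabled via php.ini, e.g. "curl_exec,exec,shell_exec".
$disabled = ini_get('disable_functions');
print $disabled !== '' ? "Disabled: $disabled" : 'No functions disabled';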
My problem was fixed in another way. After checking (as per jhenderson2099's answer) that my hosting had curl_exec enabled (which it did), I found out that my problem was caused by two lines in src/TwitterOAuth.php (the TwitterOAuth class):
$bundlePath = CaBundle::getSystemCaRootBundlePath(); // <-- comment this line
$options = [
    // CURLOPT_VERBOSE => true,
    CURLOPT_CONNECTTIMEOUT => $this->connectionTimeout,
    CURLOPT_HEADER => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYHOST => 2,
    CURLOPT_SSL_VERIFYPEER => true,
    CURLOPT_TIMEOUT => $this->timeout,
    CURLOPT_USERAGENT => $this->userAgent,
    $this->curlCaOpt($bundlePath) => $bundlePath, // <-- comment this line
];
so that your code will look like this:
//$bundlePath = CaBundle::getSystemCaRootBundlePath();
$options = [
    // CURLOPT_VERBOSE => true,
    CURLOPT_CONNECTTIMEOUT => $this->connectionTimeout,
    CURLOPT_HEADER => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYHOST => 2,
    CURLOPT_SSL_VERIFYPEER => true,
    CURLOPT_TIMEOUT => $this->timeout,
    CURLOPT_USERAGENT => $this->userAgent,
    //$this->curlCaOpt($bundlePath) => $bundlePath,
];
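A word of caution on this workaround: with CURLOPT_SSL_VERIFYPEER still set to true, removing the explicit CA bundle makes curl fall back to the system's default certificate store, so this only works where that store is present and current. If it isn't, pointing curl at a downloaded cacert.pem (as described in the answer about curl.ca_info further down) is the safer route.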
I'm trying to use PHPCrawl (http://sourceforge.net/projects/phpcrawl/) to trawl a website delivered over HTTPS.
I can see that there is support for SSL in the PHPCrawlerHTTPRequest class (openSocket method):
// If ssl -> perform Server Name Indication
if ($this->url_parts["protocol"] == "https://")
{
    $context = stream_context_create(array('ssl' => array('SNI_server_name' => $this->url_parts["host"])));
    $this->socket = @stream_socket_client($protocol_prefix.$ip_address.":".$this->url_parts["port"], $error_code, $error_str,
                                          $this->socketConnectTimeout, STREAM_CLIENT_CONNECT, $context);
}
The problem lies in the call to stream_socket_client: although it returns a zero error_code and no error_str, $this->socket is still false.
The documentation for the method states the following:
If the value returned in errno is 0 and the function returned FALSE, it is an indication that the error occurred before the connect() call.
(See http://php.net/manual/en/function.stream-socket-client.php)
So I've tried to use an example provided in the comments section to modify the stream context via stream_context_set_option, setting verify_host and verify_peer to false; neither seems to have any effect.
I'm not very proficient in PHP or the intricacies of the web - does anyone know either:
What condition (specifically) can cause this call to fail?
OR
A workaround for the issue?
I should note - I am using Facebook (HTTPS) as the test server.
I've found the issue -
PHP versions 5.6.x turn peer verification on by default, and apparently the necessary cert sometimes isn't found (see this bug report).
The workaround is to drop back to a PHP version prior to 5.6.
Old topic, but I had the same problem using the PHPCrawler. What worked for me is what a user wrote on SourceForge (source: https://sourceforge.net/p/phpcrawl/bugs/86/#5993).
What you have to do is rewrite the stream_context_create call on line 547 of PHPCrawlerHTTPRequest.class.php into the following:
$context = stream_context_create(array(
'ssl' => array(
'SNI_server_name' => $this->url_parts["host"],
'verify_peer' => false,
'verify_peer_name' => false,
)
));
Hope this helps someone in the future.
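If you'd rather not disable verification outright, a variant that keeps peer verification on by supplying a CA bundle should also work (a sketch; the cacert.pem path is an assumption for your environment):

$context = stream_context_create(array(
    'ssl' => array(
        'SNI_server_name'  => $this->url_parts["host"],
        'cafile'           => '/path/to/cacert.pem', // assumed path to a downloaded CA bundle
        'verify_peer'      => true,
        'verify_peer_name' => true,
    )
));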
Getting a weird error I have no idea how to fix.
This is the error:
( ! ) Catchable fatal error: Argument 2 passed to Guzzle\Service\Client::getCommand() must be an array, string given, called in phar://C:/wamp/www/PHPCodeLance/WebTech/Projects/MIB v2/lib/aws/aws.phar/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 93 and defined in phar://C:/wamp/www/PHPCodeLance/WebTech/Projects/MIB v2/lib/aws/aws.phar/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 113
Call Stack
# Time Memory Function Location
1 0.0009 676280 {main}( ) ..\test.php:0
2 0.0557 3311632 Aws\Ses\SesClient->send_email( ) ..\test.php:30
3 0.0557 3312128 Aws\Common\Client\AbstractClient->__call( ) ..\test.php:30
4 0.0557 3312208 Guzzle\Service\Client->__call( ) ..(null):103
5 0.0557 3312296 Guzzle\Service\Client->getCommand( ) ..(null):93
This is the code I used (straight from the AWS page)
$client = SesClient::factory(array(
    'key'    => '',
    'secret' => '',
    'region' => 'us-east-1'
));

$response = $client->send_email(
    'no-reply@amazon.com',            // Source (aka From)
    array('ToAddresses' => array(     // Destination (aka To)
        'myemail@hotmail.nl'
    )),
    array(                            // Message (short form)
        'Subject.Data'   => 'Email Test ' . time(),
        'Body.Text.Data' => 'This is a simple test message ' . time()
    )
);

// Success?
var_dump($response->isOK());
UPDATE: Fixed the issues above; now I get an SSL certificate issue:
Guzzle\Http\Exception\CurlException: [curl] 60: SSL certificate problem, verify that the CA cert is OK. Details: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed [url] https://email.us-east-1.amazonaws.com/ in phar://C:/wamp/www/PHPCodeLance/WebTech/Projects/MIB v2/lib/aws/aws.phar/vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php on line 578
Thanks in advance
For an answer to the first (now supposedly resolved - HOW?) issue, see AWS SDK Guzzle error when sending email with SES
Please, if you have a solution to an issue, particularly one as arcane as this, POST IT for others to use.
First of all, it seems that you should wrap the code that instantiates the client and sends the email in a try-catch block; that will resolve the Catchable fatal error part and allow your code to continue executing.
As for the getCommand parameter problem, my guess is that there is some issue with the arguments to send_email() that are passed down the call stack. Without digging through the SDK, I don't know off the top of my head what arguments are specifically passed to getCommand, but you have all the information you need there to debug the issue: you should be able to map how your arguments are passed through each of the calls shown in the stack trace, verifying along the way that what is passed to each function is what it expects.
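If it helps, the v2 SDK's magic __call methods expect a single associative array rather than positional arguments, so a call shaped like the following may be what getCommand wants (a sketch based on the SDK 2.x SendEmail parameter structure, not taken from the original post):

$response = $client->sendEmail(array(
    'Source'      => 'no-reply@amazon.com',
    'Destination' => array(
        'ToAddresses' => array('myemail@hotmail.nl'),
    ),
    'Message'     => array(
        'Subject' => array('Data' => 'Email Test ' . time()),
        'Body'    => array(
            'Text' => array('Data' => 'This is a simple test message ' . time()),
        ),
    ),
));
var_dump($response['MessageId']); // the response model behaves like an array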
The problem with SSL arises because cURL no longer bundles CA certs; you need to set the proper CA info.
Solution 1 (Changes to PHP.ini):
Download CA bundle (cacert.pem) from http://curl.haxx.se/docs/caextract.html
Place it on your local system (for eg: C:\xampp\cacert.pem)
Open your php.ini
Set the curl.ca_info option to point to the location of cacert.pem
Example: curl.ca_info="C:\xampp\cacert.pem"
Restart Apache
Solution 2 (Set options before each CURL call)
Download CA bundle (cacert.pem) from http://curl.haxx.se/docs/caextract.html
Place it on your local system (for eg: C:\xampp\cacert.pem)
Write the following code:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_CAINFO, 'pathto/cacert.pem'); // e.g. 'C:/xampp/cacert.pem'; forward slashes avoid escaping issues
Source: http://tumblr.wehavefaces.net/post/52114563111/environment-windows-xampp-curl-library
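For context, a minimal self-contained request using Solution 2's options might look like this (a sketch; the bundle path is an assumption for your system):

$ch = curl_init('https://email.us-east-1.amazonaws.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);          // verify the server certificate
curl_setopt($ch, CURLOPT_CAINFO, 'C:/xampp/cacert.pem'); // assumed location of the downloaded bundle
$result = curl_exec($ch);
if ($result === false) {
    echo 'Curl error: ' . curl_error($ch); // e.g. error 60 if the CA path is wrong
}
curl_close($ch);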