I know there are questions about this, but so far none of them has helped me solve my problem.
I have a PHP script whose job is to send scheduled e-mails. It does this by calling a web service, which I control, via cURL.
Run in the browser, it works fine. Run via CRON, the cURL response is empty. The request never reaches the web service; I know this because I have the WS write a text file whenever it's contacted. That file is written when the script is accessed via the browser, but not via CRON.
I know CRON runs in a different environment, and I'm not relying on any environment variables, e.g. $_SERVER. The path to require() isn't the problem either, as the script successfully gets data out of that file to connect to the DB.
This question suggested adding the following cURL options, which I've done, but no dice:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
Here's my PHP script:
//prep
define('WS_URL', 'http://mywebservicedomain.com/index.php/web_service');
require '../application/config/database.php';
$db = new mysqli($db['default']['hostname'], $db['default']['username'], $db['default']['password'], $db['default']['database']) or die('failed to connect to DB');

//get scheduled e-mails
$res = $db->query('SELECT * FROM _cron_emails WHERE send_when < NOW()');

//process each...
while ($email_arr = $res->fetch_assoc()) {

    //...get API connection info
    $api_accnt = $db->query('SELECT id, secret FROM _api_accounts WHERE project = "'.$email_arr['project'].'"');
    $api_accnt = $api_accnt->fetch_assoc();

    //...recreate $_POST env
    $tmp = json_decode($email_arr['post_vars'], true);
    $_POST = array();
    foreach ($tmp as $key => $val) $_POST[$key] = !is_array($val) ? $val : implode(',', $val);

    //...call API to send e-mail
    $ch = curl_init(WS_URL.'/send_scheduled_email/'.$email_arr['email_id'].'/'.$email_arr['store_id'].'/'.$email_arr['item_id']);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST => true,
        CURLOPT_SAFE_UPLOAD => true,
        CURLOPT_SSL_VERIFYHOST => 1, //<-- tried adding this
        CURLOPT_SSL_VERIFYPEER => 1, //<-- ditto
        CURLOPT_POSTFIELDS => array_merge($_POST, array(
            'api_key' => $api_accnt['id'],
            'api_secret' => $api_accnt['secret'],
        ))
    ));
    $resp = curl_exec($ch); //<-- empty when CRON

    //if the e-mail was sent OK, or the decider script said it was ineligible, delete it from the queue
    if ($resp == 'ok' || $resp == 'ineligible')
        $db->query('DELETE FROM _cron_emails WHERE id = '.$email_arr['id'].' LIMIT 1');
}
Does anyone have any idea why it fails to connect to my web service via cURL only when run in CRON?
Here's the CRON job:
/usr/bin/php /home/desyn/public_html/cron/send_scheduled_emails.php
In the end this turned out to be a very particular, and I imagine uncommon, problem. I doubt it'll prove much help to others, but I leave it here just in case.
Essentially, my server didn't like cURL'ing itself via a domain.
The server being called by cURL in the CRON script was the same server the CRON task was running on (but a different account and website).
This was never a problem via the browser, only via CRON.
Changing the cURL request to use the server's local IP, rather than using the domain, solved this.
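For illustration, the change amounted to something like this (a minimal sketch; the loopback IP is a placeholder, and the Host header is my assumption for a name-based virtual host, so the right site still answers):

// Sketch: same request as before, but aimed at the local IP (placeholder).
$ch = curl_init('http://127.0.0.1/index.php/web_service/send_scheduled_email/'
    . $email_arr['email_id'] . '/' . $email_arr['store_id'] . '/' . $email_arr['item_id']);
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $postFields, // the merged fields from the original script
    // Assumption: with name-based virtual hosts, the domain must still be
    // sent in the Host header so the right site answers.
    CURLOPT_HTTPHEADER => array('Host: mywebservicedomain.com'),
));
$resp = curl_exec($ch);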
I'm trying to simply read the Philips Hue lights information from my home with the following code:
$fp = fopen(dirname(__FILE__).'/errorlog.txt', 'a');
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => 'http://119.119.20.20:2827/api/Js82jH2lao-pAiws89S9A-k9hHsukw72/lights',
    CURLOPT_VERBOSE => true,
    CURLOPT_STDERR => $fp
));
$resp = curl_exec($ch);
curl_close($ch);
print_r($resp);
It returns nothing. Looking at errorlog.txt it says:
* About to connect() to 119.119.20.20 port 2827 (#0)
* Trying 119.119.20.20... * Connection refused
* couldn't connect to host
* Closing connection #0
I'm able to read the data and change light settings through a site like hurl.it, which tells me I've set up my router correctly. allow_url_fopen is on on my server. I'm using cURL because I want to do a PUT request as well. I don't want to use a library just to turn a light on and off.
How can I make this work?
Edit to clarify: I'm using an external server to host the PHP, which communicates with my Philips Hue bridge at home. You can assume I forwarded my port correctly. No VPN.
My guess is it's a user/permissions issue.
Does the www-data/http user have permission to use cURL on your server? All PHP scripts are executed as that user, so without the correct permissions cURL will fail with this error.
Which user created the PHP script? Have you changed the file permissions to give other users the correct privileges?
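If in doubt, here's a quick way to check which user the script actually runs as (a minimal sketch; assumes the posix extension is available, with a shell fallback):

// Hypothetical quick check: print the effective user PHP runs as.
if (function_exists('posix_geteuid')) {
    $user = posix_getpwuid(posix_geteuid());
    echo $user['name'];
} else {
    echo exec('whoami'); // fallback; assumes exec() is not disabled
}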
Having said all that, since you stated you don't want to use a library for something so simple, why bother with cURL at all? file_get_contents() can PUT, POST, etc. out of the box.
Get status from bridge as associative array:
$result = json_decode(file_get_contents('http://119.119.20.20:2827/api/Js82jH2lao-pAiws89S9A-k9hHsukw72/lights'), true);
Turn off all lights with a PUT request:
$data = '{"on":false}';
$result = file_get_contents('http://119.119.20.20:2827/api/Js82jH2lao-pAiws89S9A-k9hHsukw72/groups/0/action', false, stream_context_create(array(
    'http' => array(
        'method' => 'PUT',
        'header' => 'Content-Type: application/json' . "\r\n"
                  . 'Content-Length: ' . strlen($data) . "\r\n",
        'content' => $data
    ),
)));
Just tried it on my Hue setup. Works for me.
So, I am creating a script that automatically logs a user in to a remote site. It works as expected, but when the user browses any page on the remote site, it keeps asking them to log in again.
How do I prevent this? Is it possible? Here is the current code:
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL => $url,
    CURLOPT_HEADER => false,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $login,
    CURLOPT_COOKIEFILE => $rootPath.'/tmpfile/cookie.txt',
    CURLOPT_COOKIEJAR => $rootPath.'/tmpfile/cookie.txt'
));
$content = curl_exec($ch);
Your code looks workable, but a common mistake is not closing the cURL handle before making a second request. Close it, then reinitialize:
curl_close($ch);
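Here's a minimal sketch of that flow ($loginUrl and $pageUrl are placeholders): cookies are only flushed to the jar file when the handle is closed, and a fresh handle pointed at the same jar picks the session back up.

$cookieFile = $rootPath.'/tmpfile/cookie.txt';

// First request: log in and store the session cookie.
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL => $loginUrl, // placeholder
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $login,
    CURLOPT_COOKIEFILE => $cookieFile,
    CURLOPT_COOKIEJAR => $cookieFile,
));
curl_exec($ch);
curl_close($ch); // writes the cookies out to the jar

// Second request: a fresh handle pointed at the same jar stays logged in.
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_URL => $pageUrl, // placeholder
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_COOKIEFILE => $cookieFile,
    CURLOPT_COOKIEJAR => $cookieFile,
));
$content = curl_exec($ch);
curl_close($ch);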
If this still doesn't solve your problem, you can debug the following things.
1) Check what is inside the cookie.txt file. You can print the file's contents just after the login request.
2) Run the script with verbose mode enabled to see what requests are sent when the user browses pages.
CURLOPT_VERBOSE => true,
3) You can also manually copy the cookie from the file (or from the browser), use it for the 'user browse' request, and check whether the cookie works.
CURLOPT_HTTPHEADER => array("Cookie: session-x=hello-kookie"), // example
I have working third-party PHP code that verifies the receipt sent from the iPad,
but it seems https://sandbox.itunes.apple.com/verifyReceipt no longer responds to my PHP code.
There's not even an error status like {"status":21000} when I visit the URL directly.
I've tried different approaches on the server side, like curl_exec($ch); and file_get_contents().
Even this simple test gets nothing returned at all:
$result = file_get_contents('https://sandbox.itunes.apple.com/verifyReceipt');
echo $result;
I wonder if this is caused by Heartbleed, and what can I do?
my original working php code:
if ($isSandbox) {
    $endpoint = 'https://sandbox.itunes.apple.com/verifyReceipt';
}
else {
    $endpoint = 'https://buy.itunes.apple.com/verifyReceipt';
}

// Connect to Apple server and validate.
$postData = json_encode(array("receipt-data" => $receipt));
$options = array(
    'http' => array(
        'header' => "Content-type: application/x-www-form-urlencoded",
        'method' => 'POST',
        'content' => $postData
    ),
);
$context = stream_context_create($options);
$result = file_get_contents($endpoint, false, $context);
Well, after hours of searching and checking, I finally figured this out. Here's what happened:
1) I uploaded the test file to another server, and it turned out to work there, so it was not a PHP source problem.
2) I added
error_reporting(E_ALL);
ini_set('display_errors', '1');
to the PHP file, and the error was: file_get_contents(): Couldn't resolve host name.
3) Then I remembered I had run yum update on all the server software the day before, to deal with Heartbleed, and it seems some update modified resolv.conf.
4) I changed resolv.conf and added Google's nameserver 8.8.8.8, but it still wasn't working.
5) After restarting nginx and php-fpm, the problem was solved.
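For anyone hitting the same "Couldn't resolve host name" error, a quick sanity check (a minimal sketch): gethostbyname() returns the unresolved hostname itself when the lookup fails.

// Check whether PHP's environment can resolve the host at all.
$host = 'sandbox.itunes.apple.com';
$ip = gethostbyname($host);
echo ($ip === $host) ? "DNS lookup failed for $host" : "$host resolves to $ip";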
After a user signs up on my website, I need to send a SOAP request in a way that doesn't block the user. If the SOAP server is running slowly, I don't want the end user to have to wait on it. Is there a way I can send the request and let my main PHP application continue running without waiting for a response from the SOAP server? If not, is there a way to set a max timeout on the SOAP request, and handle it appropriately if the request exceeds that timeout?
Edit:
I would ideally like to handle this with a max timeout for the request. I have the following:
//ini_set('default_socket_timeout', 1);
$streamOptions = array(
    'http' => array(
        'timeout' => 0.01
    )
);
$streamContext = stream_context_create($streamOptions);
$wsdl = 'file://' . dirname(__FILE__) . '/Service.wsdl';
try {
    if (file_get_contents($wsdl)) {
        $this->_soapClient = new SoapClient($wsdl,
            array(
                'soap_version' => SOAP_1_2,
                'trace' => true,
                'stream_context' => $streamContext
            )
        );
        $auth = array('UserName' => $this->_username, 'Password' => $this->_password);
        $header = new SoapHeader(self::WEB_SERVICE_URL, "WSUser", $auth);
        $this->_soapClient->__setSoapHeaders(array($header));
    }
}
catch (Exception $e) {
    echo "we couldn't connect: " . $e;
}
$this->_soapClient->GetUser();
I set the timeout to 0.01 to try to force the connection to time out, but the request still seems to fire off. What am I doing wrong here?
I have had the same issue and implemented a solution!
I overrode
SoapClient::__doRequest();
to allow multiple SOAP calls using
curl_multi_exec();
Have a look at this asynchronous-soap project.
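The core idea, roughly sketched (simplified from the linked project, so treat the details as assumptions): override __doRequest() so the HTTP call goes through cURL, where you control timeouts and can later move the requests onto a multi handle.

// Simplified sketch: route SoapClient's HTTP through cURL so the request
// gets a hard timeout (and can be parallelized with curl_multi_exec later).
class CurlSoapClient extends SoapClient
{
    public function __doRequest($request, $location, $action, $version, $one_way = 0)
    {
        $ch = curl_init($location);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POST => true,
            CURLOPT_POSTFIELDS => $request,
            CURLOPT_HTTPHEADER => array(
                'Content-Type: ' . ($version == SOAP_1_2
                    ? 'application/soap+xml; charset=utf-8'
                    : 'text/xml; charset=utf-8'),
                'SOAPAction: "' . $action . '"',
            ),
            CURLOPT_CONNECTTIMEOUT => 2, // give up connecting after 2s
            CURLOPT_TIMEOUT => 5,        // hard cap on the whole request
        ));
        $response = curl_exec($ch);
        curl_close($ch);
        return $one_way ? '' : (string) $response;
    }
}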
Four solutions:
1) Use AJAX to do the SOAP -> Simplest SOAP example
2) Use AJAX to call a second PHP file on your server which does the SOAP (the best solution, imo)
3) Put the SOAP request at the end of your PHP file(s) (not the deluxe solution)
4) Use pcntl_fork() and do everything in a second process (I'd advise against that; it might not work with every server configuration)
Depending on the way you implement this, PHP has plenty of timeout configurations,
for example socket_set_timeout() or stream_set_timeout() (http://php.net/manual/en/function.stream-set-timeout.php). See the sketch below for SoapClient-level options.
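For instance, a couple of knobs SoapClient itself honors (a sketch; the WSDL URL is a placeholder): 'connection_timeout' caps how long connecting may take, while default_socket_timeout caps how long PHP waits for the response.

// Cap both the connect phase and the read phase of the SOAP call.
ini_set('default_socket_timeout', 2); // seconds to wait for the response
$client = new SoapClient('http://example.com/service?wsdl', array(
    'connection_timeout' => 2, // seconds to establish the connection
));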
I have a PHP site, www.test.com.
On the index page of this site, I am updating another site's (www.pgh.com) database with the following PHP code:
$url = "https://pgh.com/test.php?userID=" . $userName . "&password=" . $password;
$response = file_get_contents($url);
But now the site www.pgh.com is down, so it is also affecting my site www.test.com.
How can I add some exception handling or something else to this code, so that my site keeps working when the other site is down?
$response = file_get_contents($url);
if ($response === false)
{
    //Return error
}
From the PHP manual
Adding a timeout:
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1
    )
));
file_get_contents("http://example.com/", false, $ctx);
file_get_contents() returns false on failure.
You have two options:
Add a timeout to the file_get_contents() call using stream_context_create() (the manual has good examples; docs for the timeout parameter here). This is not perfect, as even a one-second timeout will cause a noticeable pause when loading the page.
More complex but better: use a caching mechanism. Do the file_get_contents() request in a separate script that is called frequently (e.g. every 15 minutes) via a cron job (if you have access to one), and write the result to a local file, which your actual script will read. A sketch of that split follows.
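A minimal sketch of the two pieces (file names and the cache path are placeholders):

// cache_pgh.php -- run from cron, e.g. every 15 minutes.
$ctx = stream_context_create(array('http' => array('timeout' => 5)));
$response = file_get_contents($url, false, $ctx);
if ($response !== false) {
    file_put_contents('/tmp/pgh_cache.txt', $response); // only overwrite on success
}

// index.php -- read the local copy; never waits on the remote site.
$response = @file_get_contents('/tmp/pgh_cache.txt');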