I have written a daemon to fetch some rows from MySQL and make some cURL requests based on that data. Since I'm fluent in PHP, I've written this daemon in PHP using System_Daemon from PEAR.
This works fine, but I'm curious about the best approach for connecting to MySQL. It feels weird to create a new MySQL connection every couple of seconds; should I try a persistent connection? Any other input? Keeping potential memory leaks to a minimum is of the essence...
I've cleaned up the script, attached below. I removed the MySQL stuff for now and use a dummy array instead, to keep this unbiased:
#!/usr/bin/php -q
<?php
require_once "System/Daemon.php";

System_Daemon::setOption("appName", "smsq");
System_Daemon::start();

$runningOkay = true;
while (!System_Daemon::isDying() && $runningOkay) {
    $messages = get_outgoing();
    $messages = call_api($messages);
    #print_r($messages);

    // in the real script, the calls above would set $runningOkay to false on failure
    if (!$runningOkay) {
        System_Daemon::err('smsq() produced an error, '.
            'so this will be my last run');
    }

    System_Daemon::iterate(2);
}
System_Daemon::stop();
function get_outgoing(){ # get a few rows from a mysql table
    # dummy code, this should come from mysql
    $messages = array();
    for ($i = 0; $i < 5; $i++) {
        $message = new stdClass();
        $message->msisdn = '070910507'.$i;
        $message->text = 'nr'.$i;
        $messages[] = $message;
        unset($message);
    }
    return $messages;
}
function call_api($messages = array()){
    if (count($messages) <= 0) {
        return false;
    }

    foreach ($messages as $message) {
        $message->curlhandle = curl_init();
        curl_setopt($message->curlhandle, CURLOPT_URL, 'http://yadayada.com/date.php?text='.$message->text);
        curl_setopt($message->curlhandle, CURLOPT_HEADER, 0);
        curl_setopt($message->curlhandle, CURLOPT_RETURNTRANSFER, 1);
    }

    $mh = curl_multi_init();
    foreach ($messages as $message) {
        curl_multi_add_handle($mh, $message->curlhandle);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
        // wait for activity instead of busy-spinning the CPU
        if ($running > 0) {
            curl_multi_select($mh);
        }
    } while ($running > 0);

    foreach ($messages as $message) {
        $message->api_response = curl_multi_getcontent($message->curlhandle);
        curl_multi_remove_handle($mh, $message->curlhandle);
        curl_close($message->curlhandle); // free each easy handle to avoid leaks
        unset($message->curlhandle);
    }
    curl_multi_close($mh);

    return $messages;
}
Technically, if it's a daemon, it runs in the background and doesn't stop until you ask it to. There is no need to use a persistent connection in that case, and you probably shouldn't: I'd expect the connection to close when I kill the daemon.
Basically, you should open a connection on startup and close it on shutdown, and that's about it. However, you should put some error trapping in there in case the connection drops unexpectedly while the daemon is running, so the daemon either shuts down gracefully (logging the connection drop somewhere) or retries the connection later.
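A minimal sketch of that pattern, assuming mysqli and a hypothetical do_work() function standing in for the get_outgoing()/call_api() cycle (the credentials are placeholders):

<?php
// open one connection on startup, keep it for the life of the daemon
$db = mysqli_connect('localhost', 'user', 'pass', 'smsq');

while (!System_Daemon::isDying()) {
    // cheap liveness check; reconnect if the connection dropped
    if (!$db || !@mysqli_query($db, 'SELECT 1')) {
        System_Daemon::err('MySQL connection dropped, attempting to reconnect');
        $db = @mysqli_connect('localhost', 'user', 'pass', 'smsq');
        if (!$db) {
            System_Daemon::err('Reconnect failed, shutting down');
            break;
        }
    }
    do_work($db); // hypothetical: fetch rows, fire the cURL requests
    System_Daemon::iterate(2);
}

// close once on shutdown
if ($db) {
    mysqli_close($db);
}
System_Daemon::stop();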
Maybe just add mysql_pconnect() before the while statement, but I don't know anything about PHP daemons...
Related
I am creating a PHP application which automatically inserts a record into the database every 10-20 seconds on Heroku. To achieve that I created a Procfile and I'm using a worker inside it. While deploying the application to Heroku, I found an issue while running the worker. I am new to Heroku and PHP. Can anyone help me understand what this means and how I can resolve it, so the worker runs continuously and inserts the records into the database?
This is the line my Procfile contains; it's in the root of the project:
worker: php /app/worker/db_worker.php
My db_worker.php:
<?php
include_once "./db_config.php";
$conn = $connection;
$number_1 = rand(1,50);
$number_2 = rand(60,500);
$insertQuery = "INSERT INTO random_numbers (num_1, num_2) VALUES ('".$number_1."', '".$number_2."')";
$result = mysqli_query($GLOBALS['conn'], $insertQuery);
if($result) {
echo 'Details saved';
} else {
echo 'Failed to save details';
}
?>
The simplest solution would be to run your operations in a loop. For example,
<?php
// include_once only loads db_config.php the first time through,
// so the connection is created once and reused on later iterations
include_once "./db_config.php";
$conn = $connection;

while (true) {
    $number_1 = rand(1, 50);
    $number_2 = rand(60, 500);

    $insertQuery = "INSERT INTO random_numbers (num_1, num_2) VALUES ('".$number_1."', '".$number_2."')";
    $result = mysqli_query($conn, $insertQuery);

    if ($result) {
        echo 'Details saved';
    } else {
        echo 'Failed to save details';
    }

    // recommend doing some memory cleanup here
    // ...

    // wait for 5 seconds
    sleep(5);
}
But there's a caveat: PHP wasn't designed for long-running processes, and memory can leak when a script runs for a very long time. If this doesn't work well, you might instead run the script as a one-shot job triggered by a scheduler service like cron-job.org.
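If you go the loop route, one way to keep a leak from growing unbounded is to exit once memory use crosses a threshold and let the platform restart the worker. A sketch, to place at the end of the loop body above; the 64 MB ceiling is an arbitrary example:

// inside the while (true) loop, after the insert:
gc_collect_cycles(); // force a garbage-collection pass each iteration
if (memory_get_usage(true) > 64 * 1024 * 1024) { // arbitrary 64 MB ceiling
    exit(0); // let the platform restart the worker with a fresh heap
}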
I've been working on this and couldn't find a way to understand it fully.
I have this code:
<?php
function get2($url) {
    // Create a handle.
    $handle = curl_init($url);

    // Set options...

    // Do the request.
    $ret = curlExecWithMulti($handle);

    // Do stuff with the results...

    // Destroy the handle.
    curl_close($handle);
}

function curlExecWithMulti($handle) {
    // In real life this is a class variable.
    static $multi = NULL;

    // Create a multi if necessary.
    if (empty($multi)) {
        $multi = curl_multi_init();
    }

    // Add the handle to be processed.
    curl_multi_add_handle($multi, $handle);

    // Do all the processing.
    $active = NULL;
    do {
        $ret = curl_multi_exec($multi, $active);
    } while ($ret == CURLM_CALL_MULTI_PERFORM);

    while ($active && $ret == CURLM_OK) {
        if (curl_multi_select($multi) != -1) {
            do {
                $mrc = curl_multi_exec($multi, $active);
            } while ($mrc == CURLM_CALL_MULTI_PERFORM);
        }
    }

    // Remove the handle from the multi processor.
    curl_multi_remove_handle($multi, $handle);

    return TRUE;
}
?>
The above script does this: when I run the PHP it creates a new TCP connection, returns the data, and then closes the connection.
The server is running HTTP/1.1 with Connection: keep-alive.
What I want is for the script to create the connection, return the data, and NOT close the connection, so that when I run the PHP script again it uses that same connection (provided, of course, that the connection hasn't expired after the server's timeout).
Is that possible with cURL? Am I misunderstanding what multi means in cURL?
When a program exits, all of its open sockets (indeed, all open files) are closed. There is no way to reuse a connection from one instance to another(*). You must re-open a new connection or loop within your application.
If you want to use HTTP Keep-Alive, your program must not exit.
(*) There are ways to keep a socket open inside one process and pass it to others via Unix domain sockets but that is an advanced topic I recommend against; I mention it only for completeness.
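To illustrate: if your program stays alive and reuses the same cURL handle, cURL keeps the TCP connection open between requests, as long as the server honours Keep-Alive. A sketch (the example.com URLs are placeholders):

$handle = curl_init();
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);

curl_setopt($handle, CURLOPT_URL, 'http://example.com/first');
$first = curl_exec($handle);

// Same handle, second request: cURL reuses the existing TCP connection.
curl_setopt($handle, CURLOPT_URL, 'http://example.com/second');
$second = curl_exec($handle);

curl_close($handle); // the connection closes here, with the process still alive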
How can I check whether a ping succeeded or failed using PHP's exec? I have in mind something with a while loop, but I'm not sure it's the best approach. I tried:
exec('ping www.google.com', $output)
but I would have to do a var_dump($output); to see the results. I want to check each line that the ping command returns.
$i = 2;
while (exec('ping www.google.com', $output)) {
    if ($output) {
        echo 'True';
    } else {
        echo 'False';
    }
}
I know this code is WRONG, but it's kind of what I need. If any of you could give me a head start or suggestions on how to do it, I would really appreciate it... THANKS!!
This should do it:
exec('ping -c 1 www.google.com', $output, $status);
if ($status === 0) {
    echo 'True';
} else {
    echo 'False';
}
exec's return value is the last line of output, so test the $status argument instead: ping exits with 0 on success. Pass -c 1 (-n 1 on Windows) so ping sends a single packet rather than running indefinitely.
I suggest you could use cURL (see the manual), but that all depends upon what you are trying to achieve.
Provide more data if needed.
NOTE
ping takes a hostname or IP address, not a URL, so don't prefix www.google.com with http://.
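If you do want to inspect each line of output, as the question asks, a rough sketch (assuming a Unix-like ping whose reply lines contain "bytes from"):

exec('ping -c 4 www.google.com', $output, $status);
foreach ($output as $line) {
    // each successful reply line looks like "64 bytes from ..."
    if (strpos($line, 'bytes from') !== false) {
        echo 'Reply: '.$line."\n";
    }
}
echo ($status === 0) ? 'True' : 'False';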
It's probably faster and more efficient to just do it within PHP, instead of exec'ing a shell:
$host = '1.2.3.4';
$port = 80;
$waitTimeoutInSeconds = 1;

if ($fp = fsockopen($host, $port, $errCode, $errStr, $waitTimeoutInSeconds)) {
    // It worked
    fclose($fp); // only close the socket if it was actually opened
} else {
    // It didn't work
}
Also, some servers will have exec() disabled for security reasons, so your method won't work on every server setup.
I don't know how to handle this.
There is an XML API server and I'm fetching its contents with cURL; it works fine. Now I have to poll the creditCardPreprocessors state. It also has an 'in progress' state, and PHP should wait until the process is finished. I already tried sleep and other approaches, but I can't make it work. This is a simplified variation of what I tried:
function process_state($xml){
    if ($result = request($xml)) {
        // It'll return NULL on bad state, for example
        return $result;
    }
    sleep(3);
    process_state($xml);
}
I know this can be an infinite loop, but I've tried adding a counter to exit once it reaches five; it won't exit, the server hangs, I get 500 errors for minutes, and Apache becomes unreachable for that vhost.
EDIT:
Another example
$i = 0;
$card_state = false;
// We're going to assume request() returns NULL while the card state is
// still processing and TRUE once it's done
while (!$card_state && $i < 10) {
    $i++;
    if ($result = request('XML STUFF')) {
        $card_state = $result;
        break;
    }
    sleep(2);
}
The recursive method you've defined could cause problems depending on the response timing you get back from the server. I think you'd want to use a while loop here. It keeps the requests serialized.
$returnable_responses = array('code1', 'code2', 'code3'); // the array of responses that you want the loop to stop after receiving
$max_number_of_calls = 5; // or some number
$iterator = 0;
$result = NULL;

while (!in_array($result, $returnable_responses) && ($iterator < $max_number_of_calls)) {
    $result = request($xml);
    $iterator++;
}
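If the API needs breathing room between attempts, you could keep the sleep from your own example inside the loop; a sketch:

while (!in_array($result, $returnable_responses) && ($iterator < $max_number_of_calls)) {
    $result = request($xml);
    $iterator++;
    sleep(2); // pause between polls so you don't hammer the API
}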
Here is my current code:
$SQL = mysql_query("SELECT url FROM urls") or die(mysql_error()); // Query the urls table
while ($resultSet = mysql_fetch_array($SQL)) { // Loop over the urls one row at a time
    // Now for some cURL to run it.
    $ch = curl_init($resultSet['url']); // load the url
    curl_setopt($ch, CURLOPT_TIMEOUT, 2); // No need to wait for it to load. Execute it and go.
    curl_exec($ch); // Execute
    curl_close($ch); // Close it off
} // While loop
I'm relatively new to cURL. By relatively new, I mean this is my first time using cURL. Currently it loads one URL for two seconds, then the next for two seconds, and so on. However, I want it to load ALL of them at the same time. I'm sure it's possible; I'm just unsure how. If someone could point me in the right direction, I'd appreciate it.
You set up each cURL handle in the same way, then add them to a curl_multi_ handle. The functions to look at are the curl_multi_* functions documented here. In my experience, though, there were issues with trying to load too many URLs at once (though I can't find my notes on it at the moment), so the last time I used curl_multi_, I set it up to do batches of 5 URLs at a time.
Edit: Here is a reduced version of the code I have using curl_multi_:
Edit: Slightly rewritten, with lots of added comments, which hopefully will help.
// how many URLs to run simultaneously in each batch
define('BLOCK_SIZE', 5);

// -- create all the individual cURL handles and set their options
$curl_handles = array();
foreach ($urls as $url) {
    $curl_handles[$url] = curl_init();
    curl_setopt($curl_handles[$url], CURLOPT_URL, $url);
    // set other curl options here
}

// -- start going through the cURL handles and running them
$curl_multi_handle = curl_multi_init();

$i = 0; // count where we are in the list so we can break up the runs into smaller blocks
$block = array(); // to accumulate the curl_handles for each group we'll run simultaneously

foreach ($curl_handles as $a_curl_handle) {
    $i++; // increment the position-counter

    // add the handle to the curl_multi_handle and to our tracking "block"
    curl_multi_add_handle($curl_multi_handle, $a_curl_handle);
    $block[] = $a_curl_handle;

    // -- check to see if we've got a "full block" to run or if we're at the end of our list of handles
    if (($i % BLOCK_SIZE == 0) or ($i == count($curl_handles))) {
        // -- run the block
        $running = NULL;
        do {
            // track the previous loop's number of handles still running so we can tell if it changes
            $running_before = $running;

            // run the block or check on the running block and get the number of sites still running in $running
            curl_multi_exec($curl_multi_handle, $running);

            // if the number of sites still running changed, print out a message with the number of sites that are still running.
            if ($running != $running_before) {
                echo("Waiting for $running sites to finish...\n");
            }
        } while ($running > 0);

        // -- once the number still running is 0, curl_multi_ is done, so check the results
        foreach ($block as $handle) {
            // HTTP response code
            $code = curl_getinfo($handle, CURLINFO_HTTP_CODE);

            // cURL error number
            $curl_errno = curl_errno($handle);

            // cURL error message
            $curl_error = curl_error($handle);

            // output if there was an error
            if ($curl_error) {
                echo(" *** cURL error: ($curl_errno) $curl_error\n");
            }

            // remove the (used) handle from the curl_multi_handle
            curl_multi_remove_handle($curl_multi_handle, $handle);
        }

        // reset the block to empty, since we've run its curl_handles
        $block = array();
    }
}

// close the curl_multi_handle once we're done
curl_multi_close($curl_multi_handle);
Given that you don't need anything back from the URLs, you probably don't need a lot of what's there, but this is how I chunked the requests into blocks of BLOCK_SIZE, waited for each block to run before moving on, and caught errors from cURL.