DEADLINE_EXCEEDED while reading logs from Google Cloud Logging - php

My code is based on the sample mentioned on this page:
use Google\Cloud\Logging\LoggingClient;

$filter = sprintf(
    'resource.type="gae_app" severity="%s" logName="%s"',
    strtoupper($level),
    sprintf('projects/%s/logs/app', 'MY_PROJECT_ID')
);
$logOptions = [
    'pageSize' => 20,
    'resultLimit' => 20,
    'filter' => $filter,
];
$logging = new LoggingClient();
$logs = $logging->entries($logOptions);
foreach ($logs as $log) {
    /* Do something with the logs */
}
This code is (at best) slow to complete, and (at worst) times out on the foreach loop with a DEADLINE_EXCEEDED error.
How can I fix this?

If your query does not match the first few logs it finds, Cloud Logging will attempt to search your entire logging history for the matching logs.
If there are too many logs to filter through, the search will time out with a DEADLINE_EXCEEDED message.
You can fix this by specifying a time frame to search from in your filter clause:
// Specify a time frame to search (e.g. the last 5 minutes)
$fiveMinAgo = date(\DateTime::RFC3339, strtotime('-5 minutes'));

// Add the time frame constraint to the filter clause
$filter = sprintf(
    'resource.type="gae_app" severity="%s" logName="%s" timestamp>="%s"',
    strtoupper($level),
    sprintf('projects/%s/logs/app', 'MY_PROJECT_ID'),
    $fiveMinAgo
);
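Putting the pieces together, here is the full flow with the time-bounded filter applied (a minimal sketch based on the snippets above; MY_PROJECT_ID and the 5-minute window are placeholders):

use Google\Cloud\Logging\LoggingClient;

// Only search logs newer than 5 minutes ago, so the query stays within its deadline
$fiveMinAgo = date(\DateTime::RFC3339, strtotime('-5 minutes'));
$filter = sprintf(
    'resource.type="gae_app" severity="%s" logName="%s" timestamp>="%s"',
    strtoupper($level),
    sprintf('projects/%s/logs/app', 'MY_PROJECT_ID'),
    $fiveMinAgo
);

$logging = new LoggingClient(['projectId' => 'MY_PROJECT_ID']);
$logs = $logging->entries([
    'pageSize'    => 20,
    'resultLimit' => 20,
    'filter'      => $filter,
]);

foreach ($logs as $log) {
    // Each entry now falls within the requested time frame
}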

Related

PHP - For each loop does not always do all iterations

Currently, I have the following problem:
I have created a WordPress environment that sends personalized emails to subscribers based on their preferences. This has worked for quite some time, but for the past couple of months we have been experiencing some inconsistencies. These inconsistencies are as follows:
Once in a while, the foreach loop that sends the emails stops in the middle of its execution. For example, we have a newsletter with 4000 subscribers, and occasionally the program stops its sending procedure at around 2500 emails. When this happens, there are no signs of any errors and nothing shows up in the debug log.
I have tried the following things to fix the issue:
Different sender; we switched from Sendgrid to SMTPeter (Dutch SMTP service)
Delays; we tested whether placing a wait after x number of emails would have any impact, in case there were too many requests per minute, but this made no difference.
Disabling plugins; for 5 weeks we thought we had found the problem in WordFence, but unfortunately the send function stopped again last week, so that did not appear to be the cause after all. This just shows how unstable it really is: it can go well for 5 weeks and then fail for 2.
Rewriting of functions
Logging; we write values to a txt file after every important step to keep track of where the send function stops. This lets us see which users have received an email and which still need to receive one, so that we can continue sending from there.
Debug log; the annoying thing is that even with WP_DEBUG on, nothing comes up that indicates a cause of the crash.
To schedule the sender I use WP-Cron to run the task in the background. From there, the following function is triggered:
Below is the code I wrote, in stripped-down form. I removed all the $message additions, as this is just HTML with some ACF variables for the email. I translated it so it is easier to understand.
<?php
function send_email($edition_id, $post)
{
    require_once('SMTPeter.php'); // Init SMTPeter sender
    $myfile = fopen("log.txt", "a") or die("Unable to open file!"); // Open custom log file
    $editionmeta = get_post_meta($edition_id); // Get data of the edition
    $users = get_users();

    $args = array(
        'post_type'      => 'articles',
        'post_status'    => 'publish',
        'posts_per_page' => -1,
        'order'          => 'asc',
        'meta_key'       => 'position',
        'orderby'        => 'meta_value_num',
        'meta_query'     => array(
            array(
                'key'     => 'edition_id',
                'value'   => $edition_id,
                'compare' => 'LIKE',
            ),
        ),
    );
    $all_articles = new WP_Query($args); // Get all articles of the edition

    $i = 0; // Counter: users interested in the topic
    $j = 0; // Counter: sent emails

    foreach ($users as $user) { // Loop over all users <---- This is the loop that does not always finish all iterations
        $topic_ids = get_field('topicselect_', 'user_' . $user->ID);
        $topic_id = $editionmeta['topic_id'][0];
        if (in_array($editionmeta['topic_id'][0], $topic_ids)) { // Check if the user is interested in the topic
            $i++; // Counter: interested in topic +1

            // Header info
            $headerid = $editionmeta['header_id'][0];
            $headerimage = get_field('header_image', $headerid);
            $headerimagesmall = get_field('header_image_small', $headerid);

            // Footer info
            $footerid = $editionmeta['footer_id'][0];
            $footer1 = get_field('footerblock_1', $footerid);
            $footer2 = get_field('footerblock_2', $footerid);
            $footer3 = get_field('footerblock_3', $footerid);

            $message = '*HTML header newsletter*'; // First piece of content of the email

            if ($all_articles->have_posts()) :
                $articlecount = 0; // Article count, to check for empty newsletters
                while ($all_articles->have_posts()) : $all_articles->the_post();
                    global $post;
                    $art_categories = get_the_category($post->ID); // Get categories of the article
                    $user_categories = get_field('user_categories_', 'user_' . $user->ID); // Get categories the user is interested in
                    $user_cats = array();
                    foreach ($user_categories as $user_category) {
                        $user_cats[] = $user_category->name; // Right format for comparison
                    }
                    $art_cats = array();
                    foreach ($art_categories as $art_category) {
                        $art_cats[] = $art_category->name; // Right format for comparison
                    }
                    $catcheck = array_intersect($user_cats, $art_cats); // Check if one of the article's categories matches one of the user's categories
                    if (count($catcheck) > 0) { // As soon as at least one category matches, the article is added to the newsletter
                        $message .= "*Content of article*"; // Append article to the content of the newsletter
                        $articlecount++;
                    }
                endwhile;
            endif;

            if ($articlecount > 0) { // As soon as the newsletter contains at least one article, it is sent
                $j++; // Sent-email counter
                $mailtitle = $editionmeta['mail_subject'][0]; // Subject of the email
                $sender = new SMTPeter("*API Key*"); // SMTPeter sender class
                $output = $sender->post("send", array(
                    'recipients'   => $user->user_email, // The receiving email address
                    'subject'      => $mailtitle,        // MIME's subject
                    'from'         => "*Sender*",        // MIME's sending email address
                    'html'         => $message,
                    'replyto'      => "*Reply To*",
                    'trackclicks'  => true,
                    'trackopens'   => true,
                    'trackbounces' => true,
                    'tags'         => array("$edition_id")
                ));
                error_log(print_r($output, TRUE));
                fwrite($myfile, print_r($output, true));
            }
        }
    }
    fclose($myfile);
}
All I want to know is the following:
Why can't my code run the foreach to completion, every time? It is quite frustrating to see it work like a charm one moment and get stuck again the next.
Some things I thought about but did not yet implement:
Rewrite parts of the function into separate functions. Retrieving the content and setting up the HTML for the newsletter could be done in a different function. Besides the fact that it would obviously be an improvement for cleaner code, I just wonder if this could actually be the problem.
Can a foreach crash because an fwrite tries to write to a file that is already being written to? In other words, does our log cause the function to stop running properly? (Concurrency, but is that a thing in PHP with its workers?)
Could the entire sending process be written in a different way? (See the sketch below for one possible shape.)
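For reference, the kind of restructuring I have in mind for that last point is a chunked sender: each WP-Cron run handles a batch of users and schedules the next batch, so a stalled run only loses one chunk. This is a rough sketch only; the hook name 'send_edition_chunk' and the batch size are made up:

// Hypothetical chunked sender: one batch of users per WP-Cron event
add_action('send_edition_chunk', 'send_edition_chunk', 10, 2);

function send_edition_chunk($edition_id, $offset = 0)
{
    $batch_size = 200;
    $users = get_users(array('number' => $batch_size, 'offset' => $offset));

    foreach ($users as $user) {
        // Build and send the newsletter for this user, as in send_email() above
    }

    if (count($users) === $batch_size) {
        // There may be more users: schedule the next chunk and let this request end cleanly
        wp_schedule_single_event(time() + 60, 'send_edition_chunk', array($edition_id, $offset + $batch_size));
    }
}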
Thanks in advance,
Really looking forward to your feedback and findings

Date and Time Split, then added into another table column

EDIT:
I want to thank @jimmix for giving me some ideas to get started on my last post, but unfortunately my post was put on hold due to the lack of details.
But here is the real scenario; I'm sorry if I didn't explain my question well.
From my CSV file, I have raw data, which I upload using my upload() function into my phpMyAdmin database, into a table named "tbldumpbio".
See the table structure below (tbldumpbio):
From my tbldumpbio table data, I have a function called processTimesheet().
Here's the code:
public function processTimesheet()
{
    $this->load->model('dbquery');
    $query = $this->db->query("SELECT * FROM tbldumpbio");
    foreach ($query->result() as $row) {
        $dateTimeExplArr = explode(' ', $row->datetimex);
        $dateStr = $dateTimeExplArr[0];
        $timeStr = $dateTimeExplArr[1];

        if ($row->status='C/Out' and !isset($timeStr) || empty($timeStr)) {
            $timeStrOut = '';
        } else {
            $timeStrOut = $dateTimeExplArr[1];
        }

        if ($row->status='C/In' and !isset($timeStr) || empty($timeStr)) {
            $timeStrIn = '';
        } else {
            $timeStrIn = $dateTimeExplArr[1];
        }

        $data = array(
            'ID'              => '',
            'companyAccessID' => '',
            'name'            => $row->name,
            'empCompID'       => $row->empid,
            'date'            => $dateStr,
            'timeIn'          => $timeStrIn,
            'timeOut'         => $timeStrOut,
            'status'          => '',
            'inputType'       => ''
        );
        $this->dbquery->modInsertval('tblempbioupload', $data);
    }
}
This function adds data into another table called "tblempbioupload". But here are the results that I'm getting:
Please see the data below (tblempbioupload):
The problem is:
the date should not be duplicated
Time In data should be added if the status is 'C/In'
Time Out data should be added if the status is 'C/Out'
The expected result should be something like this:
The first problem I see is that you have a time expressed as 15:xx:yy PM, which is an ambiguous format, as one can write 15:xx:yy AM and that would not be a valid time.
That said, if what you want is to write a row every time the date changes, you should do just that: store the previous date in a variable, then, when you move to the next record in the source table, compare its date with the previous one; if they differ, insert the row, otherwise simply progress to the next bit of data.
Remember that this approach only works if you're certain that the input rows are in exact order, meaning ordered by empCompID first, then by date, then by time; if they aren't, this procedure doesn't work properly.
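A minimal sketch of that first approach, assuming the rows really do come back ordered by empCompID, date and time (table, model and field names as in the question; updating the time on an already-inserted row is left out):

$previousKey = null;
foreach ($query->result() as $row) {
    list($dateStr, $timeStr) = explode(' ', $row->datetimex);
    $currentKey = $row->empid . '|' . $dateStr; // employee + date identifies one output row

    if ($currentKey !== $previousKey) {
        // The date (or employee) changed, so start a new row in tblempbioupload
        $this->dbquery->modInsertval('tblempbioupload', array(
            'name'      => $row->name,
            'empCompID' => $row->empid,
            'date'      => $dateStr,
            'timeIn'    => ($row->status == 'C/In')  ? $timeStr : '',
            'timeOut'   => ($row->status == 'C/Out') ? $timeStr : '',
        ));
        $previousKey = $currentKey;
    }
    // else: same employee and date, so update the existing row's timeIn/timeOut instead of inserting
}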
I would probably try another approach: if (though this is not clear from your question) only one row per empCompID and date should be present, I would do a grouping query on the source table, finding the minimum entrance time and the maximum exit time, and use both of them as the source for the insert query.
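A rough sketch of that grouping idea, folded into a single query with conditional aggregation (this assumes datetimex can be parsed by MySQL's DATE()/TIME(); if it is stored as text in the ambiguous format mentioned above, it needs cleaning first):

$sql = "
    SELECT
        empid,
        name,
        DATE(datetimex) AS workdate,
        MIN(CASE WHEN status = 'C/In'  THEN TIME(datetimex) END) AS timeIn,
        MAX(CASE WHEN status = 'C/Out' THEN TIME(datetimex) END) AS timeOut
    FROM tbldumpbio
    GROUP BY empid, name, DATE(datetimex)
";
$query = $this->db->query($sql);
foreach ($query->result() as $row) {
    // One consolidated row per employee and date
    $this->dbquery->modInsertval('tblempbioupload', array(
        'name'      => $row->name,
        'empCompID' => $row->empid,
        'date'      => $row->workdate,
        'timeIn'    => $row->timeIn,
        'timeOut'   => $row->timeOut,
    ));
}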

FB Ads API (#17) User request limit reached

I am working with the Facebook Ads API to get the account's campaign data. What I am doing here is getting the list of all campaigns and then, in a for loop over each campaign, getting the campaign stats.
$campaignSets = $account->getCampaigns(array(
    CampaignFields::ID,
    CampaignFields::NAME
));

foreach ($campaignSets as $campaign) {
    $campaign = new Campaign($campaign->id);
    $fields = array(
        InsightsFields::CAMPAIGN_NAME,
        InsightsFields::IMPRESSIONS,
        InsightsFields::UNIQUE_CLICKS,
        InsightsFields::REACH,
        InsightsFields::SPEND,
        InsightsFields::TOTAL_ACTIONS,
        InsightsFields::TOTAL_ACTION_VALUE
    );
    $params = array(
        'date_preset' => InsightsPresets::TODAY
    );
    $insights = $campaign->getInsights($fields, $params);
}
When executing the above code, I am getting the error (#17) User request limit reached.
Can anyone help me figure out how to solve this kind of error?
Thanks,
Ronak Shah
You should consider generating a single report against the ad account which returns insights for all of your campaigns; this should reduce the number of requests required significantly.
Cursor::setDefaultUseImplicitFetch(true);

$account = new AdAccount($account_id);
$fields = array(
    InsightsFields::CAMPAIGN_NAME,
    InsightsFields::CAMPAIGN_ID,
    InsightsFields::IMPRESSIONS,
    InsightsFields::UNIQUE_CLICKS,
    InsightsFields::REACH,
    InsightsFields::SPEND,
    InsightsFields::TOTAL_ACTIONS,
    InsightsFields::TOTAL_ACTION_VALUE,
);
$params = array(
    'date_preset' => InsightsPresets::TODAY,
    'level' => 'ad',
    'limit' => 1000,
);
$insights = $account->getInsights($fields, $params);
foreach ($insights as $i) {
    echo $i->campaign_id . PHP_EOL;
}
If you run into API limits, your only option is to reduce calls. You can do this easily by delaying API calls. I assume you are already using a Cron Job, so implement a counter that stores the last campaign you have requested the data for. When the Cron Job runs again, request the data for the next 1-x campaigns (you have to test how many are possible per Cron Job call) and store the last one again.
Also, you should batch the API calls; it will not avoid limits, but it will be a lot faster - as fast as the slowest API call in the batch.
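A rough sketch of that counter idea (the file-based offset, the batch size and the storage step are assumptions, not part of the Ads SDK; $account, $fields and $params are the same as in the question):

// Hypothetical cron callback: process a handful of campaigns per run and remember where we stopped
$offsetFile = __DIR__ . '/campaign_offset.txt';
$batchSize  = 10; // tune this so one run stays under the rate limit

$offset = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$campaignSets = $account->getCampaigns(array(
    CampaignFields::ID,
    CampaignFields::NAME
));

$i = 0;
$processed = 0;
foreach ($campaignSets as $campaign) {
    if ($i++ < $offset) continue;        // skip campaigns handled in earlier runs
    if ($processed >= $batchSize) break; // this run's quota is used up

    $insights = (new Campaign($campaign->id))->getInsights($fields, $params);
    // ... store the insights somewhere ...

    $processed++;
}

// Start over from the beginning once every campaign has been covered
file_put_contents($offsetFile, ($processed < $batchSize) ? 0 : $offset + $processed);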
Add this to your code and you'll never have to worry about FB's Rate Limiting/User Limit Reached.
Your script will automatically sleep as soon as you approach the limit, and then pick up from where it left off after the cool-down. Enjoy :)
import logging
import time
import requests as rq

# account_number and my_access_token must be defined elsewhere in your script

# Function to find the string between two strings or characters
def find_between(s, first, last):
    try:
        start = s.index(first) + len(first)
        end = s.index(last, start)
        return s[start:end]
    except ValueError:
        return ""

# Function to check how close you are to the FB Rate Limit
def check_limit():
    check = rq.get('https://graph.facebook.com/v3.3/act_' + account_number + '/insights?access_token=' + my_access_token)
    call = float(find_between(check.headers['x-business-use-case-usage'], 'call_count":', '}'))
    cpu = float(find_between(check.headers['x-business-use-case-usage'], 'total_cputime":', '}'))
    total = float(find_between(check.headers['x-business-use-case-usage'], 'total_time":', ','))
    usage = max(call, cpu, total)
    return usage

# Check if you reached 75% of the limit; if yes, back off for 5 minutes
# (put this chunk in your loop, every 200-500 iterations)
if check_limit() > 75:
    print('75% Rate Limit Reached. Cooling Time 5 Minutes.')
    logging.debug('75% Rate Limit Reached. Cooling Time 5 Minutes.')
    time.sleep(300)

How to get full list of Twitter followers using new API 1.1

I am using https://api.twitter.com/1.1/followers/ids.json?cursor=-1&screen_name=sitestreams&count=5000 to list Twitter followers, but I only got a list of 200 followers. How can I get the full list of Twitter followers using the new API 1.1?
You must first set up your application:
<?php
$consumerKey    = 'Consumer-Key';
$consumerSecret = 'Consumer-Secret';
$oAuthToken     = 'OAuthToken';
$oAuthSecret    = 'OAuth Secret';

# API OAuth
require_once('twitteroauth.php');
$tweet = new TwitterOAuth($consumerKey, $consumerSecret, $oAuthToken, $oAuthSecret);
You can download the twitteroauth.php from here: https://github.com/elpeter/pv-auto-tweets/blob/master/twitteroauth.php
Then
You can retrieve your followers like this:
$tweet->get('followers/ids', array('screen_name' => 'YOUR-SCREEN-NAME-USER'));
If you want to retrieve the next group of 5000 followers, you must add the cursor value from the first call:
$tweet->get('followers/ids', array('screen_name' => 'YOUR-SCREEN-NAME-USER', 'cursor' => 9999999999));
You can read about using cursors to navigate collections at this link: https://dev.twitter.com/docs/misc/cursoring
You can't fetch more than 200 at once... It is clearly stated in the documentation for count:
The number of users to return per page, up to a maximum of 200. Defaults to 20.
You can page through the rest via cursoring, using cursor=-1 for the first page: "If no cursor is provided, a value of -1 will be assumed, which is the first 'page'."
Here's how I run/update the full list of follower ids on my platform. I'd avoid using sleep() the way @aphoe's script does - it's really bad to keep a connection open that long, and what happens if your user has 1 million followers? Are you going to keep that connection open for a week? If you must, run a cron job or save progress to redis/memcache, and rinse and repeat until you get all the followers.
Note: my code below is a class that's run through a cron command every minute. I'm using Laravel 5.1, so you can probably ignore a lot of this code, as it's unique to my platform: TwitterOAuth fetches all the OAuth records I have in the db, TwitterFollowerList is another table (I check whether an entry already exists), TwitterFollowersDaily is another table where I store/update the total for the day for each user, and TwitterApi is the Abraham\TwitterOAuth package. You can use whatever library you want, though.
This might give you a good sense of what to do the same way, or even help you figure out a better way. I won't explain all the code, as there's a lot happening, but you should be able to work through it. Let me know if you have any questions.
/**
 * Update follower list for each oAuth
 *
 * @return response
 */
public function updateFollowers()
{
    TwitterOAuth::chunk(200, function ($oauths)
    {
        foreach ($oauths as $oauth)
        {
            $page_id = $oauth->page_id;
            $follower_list = TwitterFollowerList::where('page_id', $page_id)->first();

            if (!$follower_list || $follower_list->updated_at < Carbon::now()->subMinutes(15))
            {
                $next_cursor = isset($follower_list->next_cursor) ? $follower_list->next_cursor : -1;
                $ids = isset($follower_list->follower_ids) ? $follower_list->follower_ids : [];

                $twitter = new TwitterApi($oauth->oauth_token, $oauth->oauth_token_secret);
                $results = $twitter->get("followers/ids", ["user_id" => $page_id, "cursor" => $next_cursor]);

                if (isset($results->errors)) continue;

                $ids = $results->ids;

                if ($results->next_cursor !== 0)
                {
                    $ticks = 0;
                    do
                    {
                        if ($ticks === 13)
                        {
                            $ticks = 0;
                            break;
                        }
                        $ticks++;

                        $results = $twitter->get("followers/ids", ["user_id" => $page_id, "cursor" => $results->next_cursor]);
                        if (!$results) break;

                        $more_ids = $results->ids;
                        $ids = array_merge($ids, $more_ids);
                    }
                    while ($results->next_cursor > 0);
                }

                $stats = [
                    'page_id'        => $page_id,
                    'follower_count' => count($ids),
                    'follower_ids'   => $ids,
                    'next_cursor'    => ($results->next_cursor > 0) ? $results->next_cursor : null,
                    'updated_at'     => Carbon::now()
                ];

                TwitterFollowerList::updateOrCreate(['page_id' => $page_id], $stats);

                TwitterFollowersDaily::updateOrCreate(
                    [
                        'page_id' => $page_id,
                        'date'    => Carbon::now()->toDateString()
                    ],
                    [
                        'page_id'        => $page_id,
                        'date'           => Carbon::now()->toDateString(),
                        'follower_count' => count($ids),
                    ]
                );

                continue;
            }
        }
    });
}

Stripe API: List all Charges

I am using https://stripe.com/docs/api?lang=php#list_charges to list all charges, but here they specify:
count optional — default is 10. A limit on the number of objects to be returned. Count can range between 1 and 100 items.
I have thousands of entries; how can I get all of them? Also, if I set count to 100, it returns 110 records.
You can use the offset argument.
Once you get the first 100 transactions, make another call by adding offset=100 to the URL.
This will bring the next 100 transactions; then use offset=200, and so on.
Update:
The offset parameter is partly deprecated (API changelog - 2015-09-23); use the auto-paging iterator instead:
$charges = \Stripe\Charge::all();
foreach ($charges->autoPagingIterator() as $charge) {
    // Do something with $charge
}
Reference.
Yes, I got it: with offset we can get all the records.
Here's a PHP example: \Stripe\Charge::all(array("limit" => 3, "offset" => 10));
A Ruby example:
Stripe::Charge.all(limit: 3, offset:3)
As good as the Stripe API docs are, they could be clearer on how to filter.
source: https://stripe.com/docs/api/php#list_charges, https://stripe.com/docs/api/ruby#list_charges
In case offset is deprecated:
$result = [];
$created_at = strtotime($request->end_data); // created_at should be today's date as a Unix epoch timestamp
$has_more = false;
$a = 0;

do {
    print_r($a);
    \Stripe\Stripe::setApiKey(env('STRIPE_SECRET'));
    $temp = \Stripe\BalanceTransaction::all(array(
        'limit'   => 100,
        'created' => array(
            'lte' => $created_at,
        )
    ));
    $result = array_merge($temp->data, $result);
    $created_at = $temp->data[99]->created;
    // The API also returns a has_more (boolean) parameter, which tells you whether there is
    // more data or not, so you can put that in the while condition instead, e.g.
    // $has_more = $temp->has_more;
    $a++;
} while ($a < 5);

dd($result);
This worked for me: I was able to get 500 records at once. With $a < 5 the API is hit 5 times, and on each request the created parameter, which uses lte (less than or equal), changes so that each call returns records older than the ones the previous request provided. I am also appending the result of each API call to the $result array.
Unfortunately you can't.
I can see where such a feature would be nice for accounting purposes or whatever, but it's generally a better user experience to implement some sort of paging when displaying copious amounts of data to the user.
If you need absolute control over how many records to display at a time, I would suggest setting up a webhook on the charge.succeeded event and storing your charges locally.
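A minimal sketch of that webhook approach (the signing secret and save_charge_locally() are placeholders for your own configuration and persistence code):

\Stripe\Stripe::setApiKey('sk_test_...'); // your secret API key

$payload    = @file_get_contents('php://input');
$sig_header = $_SERVER['HTTP_STRIPE_SIGNATURE'];
$secret     = 'whsec_...'; // webhook signing secret from the Stripe dashboard

try {
    // Verifies the signature and decodes the event
    $event = \Stripe\Webhook::constructEvent($payload, $sig_header, $secret);
} catch (\Exception $e) {
    http_response_code(400); // invalid payload or signature
    exit;
}

if ($event->type === 'charge.succeeded') {
    $charge = $event->data->object;
    // save_charge_locally() is a placeholder for your own persistence layer
    save_charge_locally($charge->id, $charge->amount, $charge->created);
}

http_response_code(200);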
