I have an endpoint that I can send a GET request to, and it returns a different result depending on the limit and offset.
https://example-foo-bar.com?l=100&o=0 // returns the first 100 items.
I want to create a for loop (or a nested for loop, I assume) that fetches 100 items at a time, adding the result to an array on each pass, until the end of the response. I already have the code for sending the curl request and storing the result; I'm just struggling with the batch-processing part.
Something like:
https://example-foo-bar.com?l=100&o=0
https://example-foo-bar.com?l=100&o=99
https://example-foo-bar.com?l=100&o=199
https://example-foo-bar.com?l=100&o=218 // end of response ?
I also know how many results there are in total, stored as $count.
I ended up with something like this, but it doesn't feel like best practice:
function testLoop() {
    $limit = 100;
    $count = getCount();
    $j = ceil($count / $limit);
    for ($i = 0; $i < $j; $i++) {
        $offset = $i * $limit;
        echo 'https://example-foo-bar?l=' . $limit . '&o=' . $offset;
    }
}
testLoop();
I am not sure if I understand the question correctly, but are you looking for something like this?
$offset = 0;
$limit = 100;
$run = true;
$result_array = array();
while ($run) {
    $result_array = array_merge(
        $result_array,
        json_decode(file_get_contents("https://example-foo-bar.com?l=" . $limit . "&o=" . $offset), true)
    );
    $offset = $offset + $limit;
    if ($offset >= {somenumber}) { // stop once the offset reaches the total number of items
        $run = false;
    }
}
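Since the question says the total is already known as $count, the same idea also works as a plain for loop with your existing curl code slotted in. A rough sketch (my_curl_get() here is just a stand-in for whatever curl wrapper you already have, not a real function):

// Sketch: page through the endpoint in batches of $limit until $count items are covered.
$limit = 100;
$count = getCount(); // total number of items, as in the question
$result_array = array();
for ($offset = 0; $offset < $count; $offset += $limit) {
    $url = 'https://example-foo-bar.com?l=' . $limit . '&o=' . $offset;
    $batch = json_decode(my_curl_get($url), true); // my_curl_get() = your existing curl request code
    $result_array = array_merge($result_array, $batch);
}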
Alternatively, use a cron job to call the PHP file:
1: Create a 'schedule' table with id, link_name, offset and status columns.
2: Set the cron to run every 10 minutes and pick the first entry whose status = 0.
3: Pass its parameters to testLoop($limit) to call the function. The entry can hold the entire link or just the offset (offset = 0, offset = 100, offset = 200, and so on).
4: Once that batch is complete, update status = 1 for that row in the schedule table.
5: Ten minutes later the cron fires again and repeats from step 1; a sketch of such a worker is shown below.
Cron is a good fit for this type of batch processing; you could also use php-resque.
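A rough sketch of that cron-driven worker, using PDO; the schedule table and its columns come from the steps above, while the connection details and the decision to reuse testLoop() are illustrative assumptions:

// Sketch: run by cron every 10 minutes; processes one pending schedule row per run.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // illustrative credentials
$row = $pdo->query("SELECT id, link_name, `offset` FROM schedule WHERE status = 0 LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($row) {
    // Process the batch described by this row (e.g. the offset stored in it), then mark it done.
    testLoop(100);
    $done = $pdo->prepare("UPDATE schedule SET status = 1 WHERE id = ?");
    $done->execute(array($row['id']));
}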
I created a function inside a larger plugin for Shopware which is supposed to create a random number for every row in the database that has NULL in the "vouchercode" column. For now I have replaced the for-loop condition with a fixed number, to make sure the problem isn't caused by the loop condition itself.
The problem is that the for loop only affects the database once.
For instance: I have this table 's_plugin_tnev'. Inside that table are 6 rows, 4 of which have NULL in the vouchercode column.
As far as I understand my code, it should loop 5 times over the same table and each time update one of those NULL rows, so after every iteration one more NULL value would be filled with a random number and therefore no longer be SELECTed or UPDATEd by the loop.
Though, as mentioned earlier, this doesn't happen; the for loop apparently only works once.
Here is my code snippet:
public function generateCode()
{
    // Repeat action 5 times
    for ($i = 0; $i <= 4; $i++)
    {
        $rand = 0;
        // Creates 16 times a number and add it to the var
        for ($i = 0; $i < 15; $i++)
        {
            $rand .= mt_rand(0, 9);
        }
        // On Checkoutcomplete add $rand to database table
        $addInt = "UPDATE s_plugin_tnev SET vouchercode = $rand
                   WHERE vouchercode IS NULL
                   LIMIT 1";
        $connect = Shopware()->Db()->query($addInt);
    }
}
As you can see, I use the DBAL framework, because that is the approach best supported by Shopware.
My guess is that the mistake has something to do with the $connect variable, or that DBAL is not communicating with the database fast enough.
Maybe someone has more experience with DBAL and could help me out.
Thanks in advance,
Max K
You use $i in both for loops, so at the end of the first iteration of the outer loop $i is already 15, and the outer loop is therefore executed only once.
Try this instead :
public function generateCode()
{
    // Repeat action 5 times
    for ($i = 0; $i <= 4; $i++)
    {
        $rand = 0;
        // Creates 16 times a number and add it to the var
        for ($j = 0; $j < 15; $j++) // $j NOT $i <---
        {
            $rand .= mt_rand(0, 9);
        }
        // On Checkoutcomplete add $rand to database table
        $addInt = "UPDATE s_plugin_tnev SET vouchercode = $rand
                   WHERE vouchercode IS NULL
                   LIMIT 1";
        $connect = Shopware()->Db()->query($addInt);
    }
}
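As a side note, the generated code could also be passed as a bound parameter instead of being interpolated into the SQL string. A minimal sketch, assuming Shopware()->Db() is the usual Zend_Db-style adapter whose query() accepts a bind array as its second argument:

// Sketch: bind the generated value rather than interpolating it into the SQL.
$addInt = "UPDATE s_plugin_tnev SET vouchercode = ? WHERE vouchercode IS NULL LIMIT 1";
Shopware()->Db()->query($addInt, array($rand));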
I'm building a survey platform and I need to get the average answer rate of the survey.
What I'm currently doing is retrieving all the questions and then dividing times answered by times viewed. Is there a more efficient, less resource-consuming way, e.g. calculating the average on the DB side instead of looping through thousands of results?
Here is my working code right now that takes forever:
$total_showed = 0;
$total_answered = 0;
$total_queries = Query::where('client_app_id', '=', $app_id)->get();
foreach ($total_queries as $app_query) {
    $total_showed = $total_showed + $app_query->showed;
    $total_answered = $total_answered + $app_query->answered;
}
if ($total_showed > 0) {
    $total_arate = round(($total_answered / $total_showed) * 100, 1);
} else {
    $total_arate = 0;
}
Try
$total_showed = $total_queries->sum('showed');
$total_answered = $total_queries->sum('answered');
Since $total_queries is a collection, you can use its sum method;
see https://laravel.com/docs/5.3/collections#method-sum
This should be more efficient, I think.
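Applied to the snippet from the question, that would look roughly like this (still one query and an in-memory collection, just without the manual foreach):

// Rough sketch of the question's code using the collection's sum() method.
$total_queries = Query::where('client_app_id', '=', $app_id)->get();
$total_showed = $total_queries->sum('showed');
$total_answered = $total_queries->sum('answered');
$total_arate = $total_showed > 0 ? round(($total_answered / $total_showed) * 100, 1) : 0;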
Sure you can go into Raw SQL:
instead of:
$total_queries = Query::where('client_app_id','=', $app_id)->get();
use something like:
$total_queries = Query::select(DB::raw('SUM(showed) as counter, SUM(answered) as answered'))
->where('client_app_id','=', $app_id)->get();
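The aggregate query returns a single row, so you would then read the sums off that row; a sketch, keeping the aliases from the answer but using first() instead of get() since only one row comes back:

// Sketch: read the two sums from the single aggregated row.
$totals = Query::select(DB::raw('SUM(showed) as counter, SUM(answered) as answered'))
    ->where('client_app_id', '=', $app_id)
    ->first();
$total_arate = $totals->counter > 0 ? round(($totals->answered / $totals->counter) * 100, 1) : 0;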
You can also try the aggregate avg() function, like this:
$price = DB::table('orders')->where('finalized', 1)
    ->avg('price');
I can't understand how and why my mysql_query call stops executing.
There are two arrays I work with:
$routerTree (about 100 entries)
$dates (about 30 entries)
Here is the code:
while ($i <= count($routerTree)) {
    $currentRouter = $routerTree["router_$i"];
    echo "<td>$i</td>";
    for ($j = 0; $j < count($dates); $j++) {
        $sql = "SELECT indications.id_device FROM indications
                LEFT JOIN routers ON indications.id_device = routers.id_device
                WHERE routers.id_router = {$currentRouter['id_router']}
                  AND date(indications.dateField) = '{$dates[$j]}'
                ORDER BY routers.id_device";
        if ($res = mysql_num_rows(mysql_query($sql))) {
            echo "<td>$res</td>";
        } else {
            echo "<td>error</td>";
        }
    }
}
It stops my loop after the 18th row, even though there are about 82 more iterations to go.
My guess is that there is a small timeout on the mysql_query command.
Any help would be appreciated.
Well, after continuing to look for a solution to my problem, I finally found one. So easy and so fast... The problem was the PHP timeout. I just added the following entry to my settings.php file:
ini_set ('max_execution_time', 0);
The default value is 30 seconds; 0 means no time limit at all. But be careful with this: raise the value to what you actually need, and try not to leave it unlimited.
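For example, a bounded value along these lines keeps the safety net while still allowing a long batch run (the 300 seconds here is just an illustrative number, not something from the original post):

// Sketch: raise the limit to a finite value instead of disabling it entirely.
ini_set('max_execution_time', 300);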
I'm trying to fetch 1000+ Twitter users from my database using this API call. However, Twitter lookup only allows 100 users per call, so how can I do it with 10 calls?
If I have 2232 users in my DB and I want to do a lookup on all of them and get their details, how do I do it? I need something that counts all the users being searched, breaks them into arrays of 100 elements, makes the call for those 100, adds the response back to the database, and then moves on to the next 100.
I am using the tmhOAuth library for Twitter.
EDIT:
I was able to accomplish it using the code below, but my next question is how I can map those values back to my accounts, because screen_name is an entry and not the KEY of the array, so how can I do it?
$accounts = $this->accounts->getAll();
$accounts_chunk = array_chunk($accounts, 100);
foreach ($accounts_chunk as $accounts) {
    $screen_names = "";
    $last_key = end(array_keys($accounts));
    foreach ($accounts as $k => $account) {
        $screen_names .= $account->screen_name;
        if ($last_key == $k) {
            $screen_names .= "";
        } else {
            $screen_names .= ",";
        }
    }
    $code = $this->twitter->request('GET', $this->twitter->url("1/users/lookup"), array('screen_name' => $screen_names));
    echo "<pre>"; print_r(json_decode($this->twitter->response)); echo "</pre>";
}
But how do I update the values in the DB using this? I did a check, but the sequence of the responses always changes, so I cannot rely on the current keys.
You could loop through the total number of users and, on every hundredth iteration, loop over the next hundred users and do your Twitter magic there.
$iNumUsers = 2232; // or a mysql_num_rows-like result
for ($i = 0; $i < $iNumUsers; $i++) {
    if ($i % 100 == 0) {
        for ($j = $i; $j < $i + 100 && $j < $iNumUsers; $j++) {
            // your twitter-code here
        }
    }
}
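A simpler way to get the same batching, if it helps, is array_chunk, which also handles the uneven last batch; a sketch, where $users stands for your full list of users:

// Sketch: split the user list into batches of 100, including the final partial batch.
$batches = array_chunk($users, 100);
foreach ($batches as $batch) {
    // your twitter-code here, working on up to 100 users in $batch
}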
Hi, here are some simple steps to do this task:
1: Get screen names from your DB with a limit of 100.
2: Implode them with a comma (join them with commas).
3: Send these 100 to the users/lookup call and get the data.
4: (important) If you receive a "Rate limit exceeded" error, use a proxy; the proxy will give you another chance to make the next call of 100 users.
5: Decode the JSON and send the data to the DB.
(important) If you use the user's id instead of the screen name it will be easier to update the DB; see the sketch after this list.
Still have a problem? Shout in a comment here.
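A minimal sketch of that idea, assuming the same tmhOAuth setup as in the question; the updateByUserId() helper and the user_id property on the account rows are illustrative assumptions, not from the original post:

// Sketch: look the batch up by user_id so each result maps straight back to a DB row.
$ids = array();
foreach ($accounts as $account) {
    $ids[] = $account->user_id; // assumes each account row stores the Twitter user id
}
$this->twitter->request('GET', $this->twitter->url('1/users/lookup'), array('user_id' => implode(',', $ids)));
$users = json_decode($this->twitter->response['response']); // tmhOAuth keeps the raw body under ['response']
foreach ($users as $user) {
    // $user->id is the same id we sent, so there is no ordering problem.
    $this->accounts->updateByUserId($user->id, $user); // hypothetical helper on your accounts model
}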
The Twitter API says
You are strongly encouraged to use a POST for larger requests.
So try posting your 2,000 IDs to them.
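With the tmhOAuth client from the question, that is just a change of request method; a sketch, where $user_ids stands for a comma-separated list of ids you have built (an illustrative variable, not from the original code):

// Sketch: the same users/lookup call, sent as a POST instead of a GET.
$code = $this->twitter->request('POST', $this->twitter->url('1/users/lookup'), array('user_id' => $user_ids));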
With regards to the second part of your question
the sequence of the responses always changes so cannot use the current keys ..
Start with your array of user IDs - $ids
Get the response from Twitter as $tl
// Place the users into an array
$sortedUsers = array();
foreach ($tl as $user) {
    $user_id = $user->id;
    // $tl is *unsorted* - but $ids is *sorted*. So we place the users from $tl
    // into a new array based on how they're sorted in $ids
    $key = array_search($user_id, $ids);
    $sortedUsers[$key] = $user;
}
// Sort the array by key so the most recent is at the top
ksort($sortedUsers);
I checked through the existing topics. I have a fix for my problem, but I know it's not the right fix, and I'm more interested in making this work correctly than in creating a workaround for it.
I have a project where I have 3 tables, diagnosis, visits, and treatments. People come in for a visit, they get a treatment, and the treatment is for a diagnosis.
For displaying this information on the page, I want to show the patient's diagnosis, then show the times they came in for visits; each visit's info can then be clicked on to show the treatment info.
To do this I made this function in PHP:
<?
function returnTandV($dxid) {
    include("db.info.php");
    $query = sprintf("SELECT treatments.*, visits.* FROM treatments LEFT JOIN visits ON
        treatments.tid = visits.tid WHERE treatments.dxid = '%s' ORDER BY visits.dos DESC",
        mysql_real_escape_string($dxid));
    $result = mysql_query($query) or die("Failed because: " . mysql_error());
    $num = mysql_num_rows($result);
    for ($i = 0; $i <= $num; ++$i) {
        $v[$i] = mysql_fetch_array($result, MYSQL_ASSOC);
        ++$i;
    }
    return $v;
}
?>
The function works and will display what I want, which is all of the rows from both treatments and visits as one large associative array. The problem is that it always returns one less row than is actually in the database, and I'm not sure why. There are 3 rows total, but mysql_num_rows() will only show it as 2. My workaround has been to just add 1 ($num = mysql_num_rows($result) + 1;), but I would rather just have it be correct.
This section looks suspicious to me:
for ($i = 0; $i <= $num; ++$i) {
    $v[$i] = mysql_fetch_array($result, MYSQL_ASSOC);
    ++$i;
}
You're incrementing $i twice: once in the for header and once in the loop body.
You're also going to $i <= $num when you most likely want $i < $num.
This combination may be why you're getting unexpected results. Basically, you have three rows, but you're only asking for rows 0 and 2 (skipping row 1).
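Putting both fixes together, the loop would look something like this:

// Corrected loop: a single increment, and stop before $num so only existing rows are fetched.
for ($i = 0; $i < $num; ++$i) {
    $v[$i] = mysql_fetch_array($result, MYSQL_ASSOC);
}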
Programmers always count from 0. So, you are starting your loop at 0. If you end at 2, you have reached 3 rows.
Row0, Row1, Row2.
If $i = 0 and you increment it BEFORE adding something to the array, you skip the first row. Increment $i AFTER the loop body runs, so you start at 0 (the first key).
For loops are not good for this; rather do:
$result = mysql_query(' -- your SQL here -- ');
while ($row = mysql_fetch_array($result)) {
    $v[] = $row["dbcolumn"];
}
Then return $v from your function - compact and neat. You can also build an associative array, as long as the key is unique (like a primary id): $v[$priid] = $row[1];
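Applied to the returnTandV() function from the question, that approach would look roughly like this (same query as before, just collecting rows with a while loop):

function returnTandV($dxid) {
    include("db.info.php");
    $query = sprintf("SELECT treatments.*, visits.* FROM treatments LEFT JOIN visits ON
        treatments.tid = visits.tid WHERE treatments.dxid = '%s' ORDER BY visits.dos DESC",
        mysql_real_escape_string($dxid));
    $result = mysql_query($query) or die("Failed because: " . mysql_error());
    // Collect every row; no manual counter, so nothing gets skipped or over-counted.
    $v = array();
    while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
        $v[] = $row;
    }
    return $v;
}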