I need to clear data from a class created on parse.com. I'm using cURL with PHP.
However, I have to delete the objects one by one in a foreach loop, and that makes processing very heavy.
Is there an automated way to delete all rows of a class?
foreach ($request->results as $item) {
    $params = array(
        'className' => 'UltimaMilha',
        'objectId'  => $item->objectId // one DELETE call per object
    );
    $response = $parse->delete($params); // don't overwrite $request mid-loop
}
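If you're on the classic parse.com REST API, one way to cut the overhead is its batch endpoint, which accepts up to 50 operations per HTTP request. A hedged sketch follows; the app id, REST key and the $objectIds list are placeholders:
// Build one DELETE operation per object
$requests = array();
foreach ($objectIds as $id) {
    $requests[] = array(
        'method' => 'DELETE',
        'path'   => '/1/classes/UltimaMilha/' . $id
    );
}

// Send the operations in chunks of 50, the batch endpoint's limit
foreach (array_chunk($requests, 50) as $chunk) {
    $ch = curl_init('https://api.parse.com/1/batch');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        'X-Parse-Application-Id: YOUR_APP_ID',
        'X-Parse-REST-API-Key: YOUR_REST_KEY',
        'Content-Type: application/json'
    ));
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode(array('requests' => $chunk)));
    $response = curl_exec($ch);
    curl_close($ch);
}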
I am using the Firebase PHP Admin SDK: https://firebase-php.readthedocs.io/en/stable/realtime-database.html#update-specific-fields
Here is the example it gives to update specific fields:
$uid = 'some-user-id';
$postData = [
    'title' => 'My awesome post title',
    'body' => 'This text should be longer',
];

// Create a key for a new post
$newPostKey = $db->getReference('posts')->push()->getKey();

$updates = [
    'posts/'.$newPostKey => $postData,
    'user-posts/'.$uid.'/'.$newPostKey => $postData,
];

$db->getReference() // this is the root reference
    ->update($updates);
From that, I created a users class with an update function, like so:
public function update() {
    $data = array('users' => array('1' => 'David'));
    $this->database->getReference()->update($data);
    return true;
}
In my database I have a users node with several numerically indexed children.
Now if I run that function, $users->update(), it removes the other child and only leaves David under users.
How can I update only a specific value at a specific key without overwriting the other data?
There's nothing specific to PHP here; that's the way Realtime Database update operations work. If you want a minimal update, you have to target the deepest key that you want to update. In your case, since you're storing an array-type object, the keys are the numeric indexes of the array items you've written. If you want to modify one of them, you need to build a reference that includes the child number you want to update. In that case, none of the sibling values will be touched.
Try this instead:
$this->database->getReference('users')->update(array('1' => 'David'));
Notice here that the update is rooted at "users", and you're updating just the immediate child "1" of that.
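Equivalently, a small sketch using the same SDK: build the reference all the way down to the child itself and call set() on it, which writes only that one path.
// Writes only users/1; the other children under users are untouched
$this->database->getReference('users/1')->set('David');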
The example in the docs is a little hard to grasp as a beginner, so I have made it simpler to understand.
Once you get the newPostKey, build the path for the child and run the code below. It will only change the specified fields.
$ref = 'posts/' . $newPostKey; // path to the post you want to update
$updates = [
    'title' => 'Updated title',
    'body' => 'Updated body text',
];
$set = $db->getReference($ref)->update($updates);
I'm working on a project that connects to an external API. I have already made the connection and implemented several functions to retrieve the data; that all works fine.
The following function, however, works exactly like it should, except that it slows down my website significantly (25+ seconds).
Is this because of the nested foreach loop? And what can I do to refactor the code?
/**
 * @param $acti
 */
function getBijeenkomstenFromAct($acti) {
    $acties = array();
    foreach ($acti as $act) {
        $bijeenkomsten = $this->getBijeenkomstenFromID($act['id']);
        if (in_array('Doorlopende activiteit', $act['type'])) {
            foreach ($bijeenkomsten as $bijeenkomst) {
                $acties[] = array(
                    'id' => $act['id'],
                    'Naam' => $act['titel'],
                    'interval' => $act['interval'],
                    'activiteit' => $bijeenkomst['activiteit'],
                    'datum' => $bijeenkomst['datum']
                );
            }
        } else {
            $acties[] = array(
                'id' => $act['id'],
                'type' => $act['type'],
                'activiteit' => $act['titel'],
                'interval' => $act['interval'],
                'dag' => $act['dag'],
                'starttijd' => $act['starttijd'],
                'eindtijd' => $act['eindtijd']
            );
        }
    }
    return $acties;
}
The function "getBijeenkomstenfromID" is working fine and on it's own not slow at all. Just to be sure, here is the function:
/**
 * @param $activitieitID
 *
 * @return mixed
 */
public function getBijeenkomstenFromID($activitieitID) {
    $options = array(
        'activiteit' => array(
            'activiteit' => $activitieitID
        ),
        'limit' => 5,
        'datumVan' => date('Y-m-d')
    );
    $bijeenkomsten = $this->webservice->activiteitBijeenkomstOverzicht($this->api_key, $options);
    return $bijeenkomsten;
}
It looks like you're calling the API from inside the first foreach loop, which is not efficient.
Every time you do this:
$bijeenkomsten = $this->getBijeenkomstenFromID($act['id']);
you're adding a lot of "dead" time to your script, since you have to put up with network latency plus the time the API needs to actually do the work and transmit the result back to you. Even if each call is quick (say 100 ms total), if your first foreach loop iterates 100 times, you have already accumulated 10 seconds of waiting, and that's before getBijeenkomstenFromAct($acti) has done any real processing.
The best practice here would be to split this if possible. My suggestion:
Make getBijeenkomstenFromID($activitieitID) run asynchronously on its own for all the IDs you need to look up in the API. The key is for it to run as a separate process and then pass the array it constructs to getBijeenkomstenFromAct, which can loop over and process it happily.
So yes, I'm basically suggesting that you orchestrate your process backwards for efficiency's sake.
Look into curl_multi: http://php.net/manual/en/function.curl-multi-exec.php
It will let you call an external API asynchronously and process the returns all at once. Be aware that APIs often have their own limitations on asynchronous calls, and common sense dictates that you probably shouldn't be hammering a website with 200 separate calls. But if your number of calls is under a dozen or two (and the API allows it), curl_multi will do nicely.
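As a rough illustration, here is a minimal curl_multi sketch. The endpoint URL is made up; in practice you would point it at whatever the web service actually exposes:
// Fire all requests at once instead of one per loop iteration
$ids = array(101, 102, 103); // the activity IDs you need
$mh = curl_multi_init();
$handles = array();

foreach ($ids as $id) {
    $ch = curl_init('https://api.example.com/bijeenkomsten/' . $id); // hypothetical URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

// Run the transfers until every handle has finished
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // block until there is activity
    }
} while ($running && $status == CURLM_OK);

// Collect the responses, keyed by the original ID
$results = array();
foreach ($handles as $id => $ch) {
    $results[$id] = json_decode(curl_multi_getcontent($ch), true);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);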
I am saving a complex dataset in Laravel 4.2 and I am looking for ways to improve this.
$bits has several $bobs. A single $bob can be one of several different classes. I am trying to duplicate a singular $bit and all its associated $bobs and save all of this to the DB with as few calls as possible.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];
foreach ($this->bobs as $index => $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
    $bobsPivotData[] = [
        'bit_id' => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id' => $newBob->id,
        'order' => $index
    ];
}

// I now want to save all the $bobs kept in $newBobs[]
DB::table('bobs')->insert($newBobs);

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
My problem is that I can't access $newBob->id before I have inserted $newBob, which only happens after the loop.
I am looking for the best way to reduce saves to the DB. My best guess is that if I can predict the ids that are going to be used, I can do all of this in one loop. Is there a way to predict these ids?
Or is there a better approach?
You could insert the bobs first and then use the generated ids to build the pivot records. This isn't a great solution in a multi-user environment, as new bobs could be inserted in the interim and mess things up, but it could suffice for your application.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];
foreach ($this->bobs as $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
}

$insertId = DB::table('bobs')->insertGetId($newBobs);
$insertedBobs = DB::table('bobs')->where('id', '>=', $insertId)->get();

foreach ($insertedBobs as $index => $newBob) {
    $bobsPivotData[] = [
        'bit_id' => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id' => $newBob->id,
        'order' => $index
    ];
}

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
I have not tested this, so some pseudo-code to be expected.
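If the multi-user caveat above is a concern, here is an alternative sketch (equally untested): let Eloquent assign the real auto-increment ids by saving each bob inside a transaction, and keep the single batched insert for the pivot rows. It trades the one bulk insert on bobs for one insert per bob, but there is no id guessing involved:
DB::transaction(function () use ($newBit) {
    $bobsPivotData = [];
    foreach ($this->bobs as $index => $bob) {
        $newBob = $bob->replicate();
        $newBob->save(); // the auto-increment id is populated here
        $bobsPivotData[] = [
            'bit_id' => $newBit->id,
            'bob_type' => get_class($newBob),
            'bob_id' => $newBob->id,
            'order' => $index
        ];
    }
    // Still a single insert for all the pivot rows
    DB::table('bobs_pivot')->insert($bobsPivotData);
});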
I want to update Magento products through the Magento PHP SOAP API.
This request works fine:
$result = $soap->catalogProductUpdate($session_id, $res[0]['sku'], array(
    'price' => '69,99'
));
But I want to do something like this:
$result = $soap->catalogProductUpdate($session_id, $res[0]['sku'], array(
    $_POST['t0'] => '69,99'
));
The variable $_POST['t0'] contains the name of the field I want to update; in this case $_POST['t0'] = 'price'.
But it does not work. Any idea how to use a POST variable as an array key?
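In plain PHP, a string taken from $_POST works as an array key, so the array construction itself should be fine. A hedged sketch that trims the posted value, whitelists it against the attribute names you allow (the $allowed list here is made up), and dumps the array so you can verify what is actually being sent:
$allowed = array('price', 'name', 'status'); // hypothetical whitelist
$field = isset($_POST['t0']) ? trim($_POST['t0']) : '';

if (in_array($field, $allowed, true)) {
    $data = array($field => '69,99');
    var_dump($data); // confirm the key really is 'price'
    $result = $soap->catalogProductUpdate($session_id, $res[0]['sku'], $data);
}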
I want to merge some data in Elasticsearch, but every time it replaces my previous data instead of merging it.
When a new value comes in, it should be added to the existing data, not replace it. For example, a user named "Christofer" already exists in the "update_field" field; when I call array_merge($usernames), where $usernames contains one or a couple of usernames, the update always replaces the previous data.
I am working in PHP.
$usernames = array("Johanna", "Maria");
$doc = array();
$doc['update_field'] = array_merge($usernames);
$u_params = array();
$u_params['id'] = 'my_id';
$u_params['index'] = 'my_index';
$u_params['type'] = 'my_type';
$u_params['body'] = array('doc' => $doc);
$client->update($u_params);
To be clearer, as an example: say a couple of usernames already exist in the field, like "Christofer", "Henrik" and "Eric".
Now I want to add more users, like "Johanna", "Maria", ...
But every time I merge and update the document, the existing data is replaced: ("Christofer", "Henrik", "Eric") gets replaced by ("Johanna", "Maria").
I want the new names to be added, not to replace what is there.
Does anybody know how I can merge in the new data? Thanks in advance.
You need to use a partial update. Try this instead, i.e. send a doc hash in the body with the fields to merge (i.e. update_field):
$params = [
    'index' => 'my_index',
    'type' => 'my_type',
    'id' => 'my_id',
    'body' => [
        'doc' => [
            'update_field' => array_merge($usernames)
        ]
    ]
];
$client->update($params);
UPDATE
That's right: with a plain partial update, scalar values and arrays get replaced rather than merged.
You may want to try a scripted partial update then:
$usernames = array("Johanna", "Maria");
$script = array();
$script['script'] = 'ctx._source.update_field += new_value';
$script['params'] = array('new_value' => array_merge($usernames));
$u_params = array();
$u_params['id'] = 'my_id';
$u_params['index'] = 'my_index';
$u_params['type'] = 'my_type';
$u_params['body'] = $script;
$client->update($u_params);
And make sure that scripting is enabled in your elasticsearch.yml config file:
script.disable_dynamic: false