Skip first "n" keys in Amazon S3 ListObjectsV2 - php

I am able to list files from my bucket using
$contents = $this->s3client->listObjectsV2([
    'Bucket' => $this->bucket_name,
    'ContinuationToken' => $continuationToken,
    'Prefix' => $prefix,
    'Delimiter' => $delimiter
]);
return $contents;
By playing with MaxKeys, ContinuationToken and StartAfter, I am able to limit or get exactly the number of keys I need. However, I want to skip the first "n" keys. How can I achieve this?
E.g. my bucket with MaxKeys = 5 returns [1, 2, 3, 4, 5].
I need it to skip the first 2 keys, so that with MaxKeys = 5 it returns [3, 4, 5].
I should be able to skip 2, 20, 200 or any other defined number. StartAfter does not really help with this.

I had to do it the ugly way. Since I would otherwise have had to keep calling the API, I made it fetch everything in one go and pushed the results to an array.
Then I discarded the items I did not need. That was the only way I came up with to do it with a single listing pass instead of two requests as mentioned in the comment above.
Yes, it is a very ugly fix.
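For reference, a rough sketch of that workaround, assuming the AWS SDK for PHP v3 and the same $this->s3client, bucket and $prefix as above; $skip and $limit are illustrative names for the number of keys to drop and keep:

// Fetch all matching keys, then discard the first $skip of them.
$skip  = 2;
$limit = 5;
$keys  = [];
$continuationToken = null;

do {
    $params = [
        'Bucket' => $this->bucket_name,
        'Prefix' => $prefix,
    ];
    if ($continuationToken !== null) {
        $params['ContinuationToken'] = $continuationToken;
    }
    $contents = $this->s3client->listObjectsV2($params);

    foreach ($contents['Contents'] ?? [] as $object) {
        $keys[] = $object['Key'];
    }
    $continuationToken = $contents['NextContinuationToken'] ?? null;
} while ($continuationToken !== null);

// Drop the first $skip keys and keep the next $limit.
$page = array_slice($keys, $skip, $limit);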

Related

How can I access a specific depth of an associative array with variable variables

I've been hitting my head against this problem for some time now. I am working on a piece of software that creates a tree from a MySQL result, checking for changes between each row to decide where in the tree to put the new data.
I'm now at a dead end while trying to understand how I can dynamically tell PHP to address different parts of my array.
I thought about using variable variables, but it doesn't seem to be working. To make my life easier I set up a test file in which to test this behavior, and this is the result...
$array = [
    0 => [
        "name" => "test"
    ],
    1 => [
        "name" => "test",
        "data" => [
            "content" => 5
        ]
    ]
];
$ref = 'array["1"]["name"]';
echo $ref."\n";
echo $$ref;
Output
array["1"]["name"]
Notice: Undefined variable: array["1"]["name"] in P:\xampp\htdocs\assets\php\test.php on line 23
I was instead expecting something like test.
I would also like to mention that I've tried the ${} method, but it doesn't affect the array; instead it adds the data to another variable, on those rare occasions when it doesn't output an error.
Can anyone help? Thanks!
After thinking about the problem once again, I came up with a work-around to achieve the intended result.
I decided to use references made with &$var.
So I tweaked the code to build an array of the steps needed to arrive at the intended location, instead of a string. Example follows:
// Old method
$ref = 'array["1"]["name"]';
// New method
$ref = ["1", "name"];
The code then cycles through those steps, re-referencing the original array one level deeper each time...
// Referencing the original array
$referencedArray = &$array;
// Going one step at a time inside the nested array
foreach ($ref as $k => $v) {
    $referencedArray = &$referencedArray[$v];
}
I believe this solution could fit my case, but if you have any suggestions please let me know.
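Putting the pieces together, here is a minimal self-contained sketch of this reference-walking idea; the walkPath helper name is mine, not from the original code:

// Walk a nested array along a path of keys and return a reference to the target.
function &walkPath(array &$array, array $path) {
    $cursor = &$array;
    foreach ($path as $key) {
        $cursor = &$cursor[$key]; // step one level deeper on each iteration
    }
    return $cursor;
}

$array = [
    0 => ["name" => "test"],
    1 => ["name" => "test", "data" => ["content" => 5]],
];

$ref = [1, "name"];
$target = &walkPath($array, $ref);
echo $target;        // "test"
$target = "changed"; // writes back into $array[1]["name"]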

PHP syntax to use generic value as opposed to hard coded row number

I have a PHP function that allows the user to generate a PDF certificate once all 8 modules are complete.
<?php if($sname['8']=="Yes"){ ?>
//show button to create PDF
<?php } ?>
The $sname['8'] refers to an actual table row, i.e. there are the following rows in the DB: 1, 2, 3, 4, 5, 6, 7, 8, and each gets a 'Yes' or 'No' to indicate whether it was passed.
But I am adding a new course which only needs 6 modules to complete. I can get the appropriate value for the number of modules needed in the following var:
echo $courseDetails['modules']; //could be 8 or 6
So I need to change the condition to use $courseDetails['modules'] instead of the hard-coded '8', but I don't understand the syntax for making this happen.
eg
//if($sname['8']=="Yes"){
if($sname['$courseDetails['modules']']=="Yes"){
if($sname[" . $courseDetails['modules'] . "]=="Yes"){
Any help appreciated.
I would suggest that you use a numeric array instead of your current associative version. Currently you use string keys ('1', '2', '3', ...). Although this is possible with an associative array, using a numeric array would simplify the task of determining the array's length and handling its individual elements.
In order to find out how many criteria there are to pass you could use
$criteria=array('No','No','No','No','No','No'); // define the criteria
// Now, check how many elements there are:
$critcount=count($criteria); // = 6
However, you need to know that the first element in the array has the index 0 and not 1, so to check the sixth element you would do:
$num=6;
if($criteria[$num-1]=="Yes"){ /* do something */ }
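As for the syntax in the original question, you can use the variable as the array key directly, with no extra quoting. A small sketch, assuming the $sname keys stay '1'..'8' as described:

// Use the module count as the key directly; no string interpolation needed.
$lastModule = $courseDetails['modules']; // 8 or 6 depending on the course
if ($sname[$lastModule] == "Yes") {
    // show button to create PDF
}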

Best / Most Efficient averages of multi-dimensional array PHP

I am working on an application that does the following:
Each month, an API returns a series of values depending on the data, something like this (these are updated every day, so the results are cached):
$data = array(
    array("2016-02-03" => 3, "2016-02-04" => 4, "2016-02-05" => 1),
    array("2016-02-03" => 1, "2016-02-04" => 2, "2016-02-05" => 3),
    array("2016-02-03" => 60, "2016-02-04" => 18, "2016-02-05" => 3),
);
What I am trying to achieve is that the algorithm takes the first key ("2016-02-03"), iterates through all of the sub-arrays to find the values for this key, sums them up, calculates the average, and finally adds this to another array. This continues until there are no more keys left.
I could write a huge foreach loop and do it that way, but there are over 40 sub-arrays, each containing around 30 days' worth of data, so this would be inefficient.
Is there an alternative way of solving this problem? One that won't be intensive and slow?
I can only imagine the solution is to let it run on the server for as long as it takes. I also suggest that after each date match you add the value to your new array and unset the index, to reduce memory and the time needed to loop through everything.
In your new array you can have the syntax:
[ "Year-month-day" => [...] ]
Where the dots will be all the values.
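For what it's worth, a rough sketch of the straightforward single-pass approach (variable names are mine; the ?? operator assumes PHP 7):

// Sum and count each date across all sub-arrays, then divide once at the end.
$sums = [];
$counts = [];
foreach ($data as $series) {
    foreach ($series as $date => $value) {
        $sums[$date]   = ($sums[$date] ?? 0) + $value;
        $counts[$date] = ($counts[$date] ?? 0) + 1;
    }
}

$averages = [];
foreach ($sums as $date => $total) {
    $averages[$date] = $total / $counts[$date];
}
// e.g. $averages["2016-02-03"] would be (3 + 1 + 60) / 3

With roughly 40 sub-arrays of about 30 dates each, this is only around 1,200 iterations, so a plain loop is unlikely to be a real bottleneck.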

How can I modify a traditional cartesian product to reduce my memory overhead?

For my problem, I'm selecting up to 24 items from a pool of maybe 5-10,000 items. In other words, we're generating configurations.
The number 24 comes from the item categories: each item is associated with a particular installation location, and an item from location 1 cannot be installed in location 10, so I have arranged my associative array to organize the data in groups. Each item looks like:
$items[9][] = array("id" => "0", "2" => 2, "13" => 20);
Where the first parameter ( $items[9] ) tells you the location it is allowed in. If it helps, think of it as: you cannot install a tire in the spot for an exhaust pipe.
The items are stored in a MySQL database. The user can specify restrictions on the solution, for example, attribute 2 must have a final value of 25 or more. They can have multiple competing restrictions. The queries retrieve items that have any value for the attributes under consideration (unspecified attributes are stored but we don't do any calculations with them). The PHP script then prunes out any redundant choices (for example: if item 1 has an attribute value of 3 and item 2 has an attribute value of 5, in the absence of another restriction you would never choose item 1).
After all the processing has occurred, I get an associative array that looks like:
$items[10][] = array("id" => "3", "2" => 2, "13" => 100);
$items[10][] = array("id" => "4", "2" => 3, "13" => 50);
$items[9][] = array("id" => "0", "2" => 2, "13" => 20);
$items[9][] = array("id" => "1", "2" => -1, "13" => 50);
I have posted a full example data set at this pastebin link. There is reason to believe I can be more restrictive on what I accept into the data set but even at a restriction of 2 elements per option there's a problem.
In the array() value, the id is the reference to the index of the item in the array, and the other values are attribute id and value pairs. So $items[10][] = array("id" => "3", "2" => 2, "13" => 100); means that in location 10 there is an item with id 3 which has a value of 2 in attribute 2 and a value of 100 in attribute 13. If it helps, think of an item as being identified by a pair, e.g. (10,0) is item 0 in location 10.
I know I'm not being specific: there are 61 attributes, and I don't think what they represent changes the structure of the problem. If we want, we can think of attribute 2 as weight and attribute 13 as cost. The problem the user wants solved might be to generate a configuration where the weight is exactly 25 and the cost is minimized.
Back of the envelope math says a rough estimate, if there were only 2 choices per location, is 2^24 choices x size of the record. Assuming a 32 bit integer could be encoded to represent a single record somehow, we're looking at 16,777,216 * 4 = 67,108,864 bytes of memory (utterly ignoring data structure overhead) and there is no reason to believe that either of these assumptions is going to be valid, though an algorithm with an upper memory bound in the realm of 67 megs would be an acceptable memory size.
There's no particular reason to stick to this representation, I used associative arrays since I have a variable number of attributes to use and figured that would allow me to avoid a large, sparse array. Above "2"=>2 actually means that filtered attribute with id #2 has a value of 2 and similarly attribute 13's value is 100. I'm happy to change my data structure to something more compact.
One thought I had was that I do have an evaluation criterion I can use to discard most of the intermediate configurations. As an example, I can compute 75 * (value of attribute 2) + 10 * (value of attribute 13) to provide a relative weighting of the solutions. In other words, if there were no other restrictions on a problem, each value improvement of 1 in attribute 2 costs 75 and each value improvement in attribute 13 costs 10. Continuing the idea of a car part, think of it like buying a stock part and having a machinist modify it to our specifications.
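As a sketch, that weighting could be written as a small scoring function over a candidate configuration; the scoreConfiguration name and the $weights layout are illustrative, taken only from the example above:

// Score a configuration (an array of chosen items, one per location) by
// summing weighted attribute values; with these weights, lower is cheaper.
function scoreConfiguration(array $configuration, array $weights) {
    $score = 0;
    foreach ($configuration as $item) {
        foreach ($weights as $attributeId => $weight) {
            if (isset($item[$attributeId])) {
                $score += $weight * $item[$attributeId];
            }
        }
    }
    return $score;
}

// Example: 75 per unit of attribute 2, 10 per unit of attribute 13.
$weights = ["2" => 75, "13" => 10];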
One problem I see with discarding configurations too early is that the weighting function does not take into account restrictions such as "the final result must have a value of "2" that is at exactly 25". So it's fine if I have a full 24 element configuration, I can run through a loop of the restrictions, discard the solutions that don't match and then finally rank the remaining solutions by the function, but I'm not sure there's a valid line of thought that allows me to throw away solutions earlier.
Does anyone have any suggestions on how to move forward? Although a language agnostic solution is fine, I am implementing in PHP if there's some relevant language feature that might be useful.
I solved my issue with memory by performing a depth-first cartesian product. I can weigh the solutions one at a time and retain some if I choose, or simply output them as I do in this code snippet.
The main inspiration for this solution came from the very concise answer on this question. Here is my code, as it seems that finding a PHP depth-first cartesian product algorithm is less than trivial.
function dfcartesian ( $input, $current, $index ) {
    // sample use: $emptyArray = array();
    // dfcartesian( $items, $emptyArray, 0 )
    if ( $index == count( $input ) ) {
        // If we have iterated over the entire space and are at the bottom
        // do whatever is relevant to your problem and return.
        //
        // If I were to improve the solution I suppose I'd pass in an
        // optional function name that we could pass data to if desired.
        var_dump( $current );
        echo '<br><br>';
        return;
    }

    // I'm using non-sequential numerical indices in an associative array
    // so I want to skip any empty numerical index without aborting.
    //
    // If you're using something different I think the only change that
    // needs attention is to change $index + 1 to a different type of
    // key incrementer. That sort of issue is tackled at
    // https://stackoverflow.com/q/2414141/759749
    if ( isset( $input[$index] ) ) {
        foreach ( $input[$index] as $element ) {
            $current[] = $element;
            // despite my concern about recursive function overhead,
            // this handled 24 levels quite smoothly.
            dfcartesian( $input, $current, ( $index + 1 ) );
            array_pop( $current );
        }
    } else {
        // move to the next index if there is a gap
        dfcartesian( $input, $current, ( $index + 1 ) );
    }
}
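For context, a small usage sketch; it is illustrative only and uses sequential 0-based location indices so that count($input) matches the highest index + 1:

// Two locations with two candidate items each.
$items = [];
$items[0][] = array("id" => "3", "2" => 2, "13" => 100);
$items[0][] = array("id" => "4", "2" => 3, "13" => 50);
$items[1][] = array("id" => "0", "2" => 2, "13" => 20);
$items[1][] = array("id" => "1", "2" => -1, "13" => 50);

dfcartesian( $items, array(), 0 ); // dumps the 2 x 2 = 4 combinations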
I hope this is of use to someone else tackling the same problem.

PHP array memory consumption for unused offsets

Sorry if this question has already been answered somewhere else, but I couldn't find it (possibly because I had a tough time phrasing my question properly).
I'm working with a two-dimensional array which is the result set from a DB query. I've got it set up so the array's first index is the PK of the row, so the array would look like...
$array[345] = ['id' => 345,
               'info1' => 'lorem',
               'info2' => 'ipsum'];
$array[448] = ['id' => 448,
               'info1' => 'lorem',
               'info2' => 'ipsum'];
My question: I know the indexes are being passed as integers. So I'm thinking (perhaps incorrectly) that they are being treated as numerical offsets by the array (as opposed to associatively). So, if the first index is 345, does the system automatically reserve space in memory for indexes 0 through 344? The code all works perfectly, but I am wondering if this method is going to eat up a boatload of memory, especially if I get to the point where there are only two rows being stored, at 322,343 and 554,324. Sorry if it's a dumb question; thanks for any answers.
No, PHP arrays are hashmaps and keys don't equal offsets, e.g.
$foo = array(0 => 'x', 1000 => 'y');
is two elements only. There is nothing reserved in between.
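A quick way to convince yourself of this (a hedged sketch; exact byte counts vary by PHP version):

// Compare memory used by an array with two small keys vs. two huge keys.
$before = memory_get_usage();
$a = array(0 => 'x', 1 => 'y');
$afterSmall = memory_get_usage();

$b = array(322343 => 'x', 554324 => 'y');
$afterLarge = memory_get_usage();

echo ($afterSmall - $before) . " bytes vs " . ($afterLarge - $afterSmall) . " bytes\n";
// Both arrays hold exactly two elements; the large keys do not reserve
// space for the missing offsets, so the two numbers come out about the same.
echo count($b); // 2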
