Optimizing array merge operation - php

I would appreciate any help given.
I have 7 separate arrays with approx. 90,000 numbers in each array (let's call them arrays1-arrays7). There are no duplicate numbers within each array itself, but there can be duplicates between the arrays. For example, array2 has no duplicates on its own, but it can have numbers in common with arrays3 and arrays4.
The Problem:
I am trying to identify all of the numbers that appear exactly 3 times once all 7 arrays are merged.
I must do this calculation 1,000 times, and it takes 15 minutes, which is not acceptable because I then have to run the whole process 40 times.
If you know of another language that is better suited to this type of calculation, please let me know. Any extension suggestions, such as Redis or Gearman, are also welcome. The code:
for ($kj = 1; $kj <= 1000; $kj++)
{
    $result = array_merge($files_array1, $files_array2, $files_array3, $files_array4, $files_array5, $files_array6, $files_array7);
    $result = array_count_values($result);
    $fp_lines = fopen("equalTo3.txt", "w");
    foreach ($result as $key => $val)
    {
        if ($result[$key] == 3)
        {
            fwrite($fp_lines, $key."\r\n");
        }
    }
    fclose($fp_lines);
}
I have also tried the code below with strings, but the array_map() call and the array_count_values() call take 17 minutes:
for ($kj = 1; $kj <= 1000; $kj++)
{
    $result = '';
    for ($ii = 0; $ii < 7; $ii++) {
        $result .= $files_array[$hello_won[$ii]]."\r\n";
    }
    $result2 = explode("\n", $result);          // 5 mins
    $result2 = array_map("trim", $result2);     // 11 mins
    $result2 = array_count_values($result2);    // 4-6 mins
    $fp_lines = fopen("equalTo3.txt", "w");
    foreach ($result2 as $key => $val)
    {
        if ($result2[$key] == 3)
        {
            fwrite($fp_lines, $key."\r\n");
        }
    }
    fclose($fp_lines);
    unset($result2);
}

array_merge() is significantly slower with more elements in the array because (from php.net):
If the input arrays have the same string keys, then the later value for that key will overwrite the previous one. If, however, the arrays contain numeric keys, the later value will not overwrite the original value, but will be appended.
Values in the input array with numeric keys will be renumbered with incrementing keys starting from zero in the result array.
So this function is actually doing conditional work on every element. You can replace array_merge() with plain appending: a loop (foreach or any other) and the [] operator. You can write a function imitating array_merge(), like this (using references so the arrays are not copied):
function imitateMerge(&$array1, &$array2) {
    foreach ($array2 as $i) {
        $array1[] = $i;
    }
}
You should see a noticeable speed increase.
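For example, the seven arrays from the question could be combined with that helper before counting (a short sketch reusing the variable names from the question):
$result = array();
imitateMerge($result, $files_array1);
imitateMerge($result, $files_array2);
imitateMerge($result, $files_array3);
imitateMerge($result, $files_array4);
imitateMerge($result, $files_array5);
imitateMerge($result, $files_array6);
imitateMerge($result, $files_array7);
$result = array_count_values($result);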

I suggest replacing
foreach ($result as $key => $val)
{
    if ($result[$key] == 3)
    {
        fwrite($fp_lines, $key."\r\n");
    }
}
with something like
$res = array_keys(array_filter($result, function ($val) { return $val == 3; }));
fwrite($fp_lines, implode("\r\n", $res));

This is probably all wrong; see the last edit below.
I also think array_merge is the problem, but my suggestion would be to implement a function that counts the values in the several arrays directly instead of merging first. This depends a little on how much overlap you have in your arrays. If the overlap is very small then this might not be much faster than merging, but with significant overlap (rand(0, 200000) to fill the arrays when I tried) it will be much faster.
function arrValues($arrs) {
    $values = array();
    foreach ($arrs as $arr) {
        foreach ($arr as $val) {
            if (array_key_exists($val, $values)) {
                $values[$val]++;
            } else {
                $values[$val] = 1;
            }
        }
    }
    return $values;
}
var_dump(arrValues(array(
    $files_array1,
    $files_array2,
    $files_array3,
    $files_array4,
    $files_array5,
    $files_array6,
    $files_array7
)));
The computation takes about 0.5s on my machine, then another 2s for printing the stuff.
-edit-
It's also not clear to me why you do the same thing 1000 times. Are the arrays different each time or something? Saying a bit about the reason might give people additional ideas...
-edit again-
After some more exploration I don't believe array_merge is at fault any more. You don't have enough overlap to benefit that much from counting everything directly. Have you investigated the available memory on your machine? For me, merging 7 arrays with 90k elements in each takes ~250M. If you have allowed PHP to use this much memory, which I assume you have since you don't get any allocation errors, then maybe the problem is that the memory is simply not available and you get a lot of page faults? If this is not the problem, then on what kind of machine and what PHP version are you running? I've tested your original code on 5.5 and 5.4, and after fixing the memory issue it also runs in about 0.5s. That is per iteration, mind you. Now if you do this 1000 times in the same PHP script it will take a while, even more so considering that you allocate all this memory each time.
I believe you really should consider putting stuff in a database. Given your numbers it seems you have ~500M rows in total. That is an awful lot to handle in php. A database makes it easy.
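As a rough illustration of the database route (a sketch only, assuming an SQLite file used through PDO; the numbers table and num column are invented names, and you would load the data once rather than on every run):
$pdo = new PDO('sqlite:numbers.db');
$pdo->exec('CREATE TABLE IF NOT EXISTS numbers (num INTEGER)');

// insert the values from all seven arrays (a transaction keeps this fast)
$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO numbers (num) VALUES (?)');
foreach (array($files_array1, $files_array2 /* ...and the rest */) as $arr) {
    foreach ($arr as $num) {
        $stmt->execute(array($num));
    }
}
$pdo->commit();

// let the database do the counting
$dup = $pdo->query('SELECT num FROM numbers GROUP BY num HAVING COUNT(*) = 3');
foreach ($dup as $row) {
    echo $row['num'], "\r\n";
}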

Related

How do I pre-allocate memory for an array in PHP?

How do I pre-allocate memory for an array in PHP? I want to pre-allocate space for 351k longs. The function works when I don't use the array, but if I try to save long values in the array, then it fails. If I try a simple test loop to fill up 351k values with a range(), it works. I suspect that the array is causing memory fragmentation and then running out of memory.
In Java, I can use ArrayList al = new ArrayList(351000);.
I saw array_fill and array_pad but those initialize the array to specific values.
Solution:
I used a combination of answers. Kevin's answer worked alone, but I was hoping to prevent problems in the future too as the size grows.
ini_set('memory_limit', '512M');
$foundAdIds = new \SplFixedArray(100000); # google doesn't return deleted ads. must keep track and assume everything else was deleted.
$foundAdIdsIndex = 0;
// $foundAdIds = array();
$result = $gaw->getAds(function ($googleAd) use ($adTemplates, &$foundAdIds, &$foundAdIdsIndex) { // use a callback to avoid saving everything in memory
    if ($foundAdIdsIndex >= $foundAdIds->count()) {
        $foundAdIds->setSize((int) ($foundAdIds->count() * 1.10)); // grow the array
    }
    $foundAdIds[$foundAdIdsIndex++] = $googleAd->ad->id; # save ids to know which to not set deleted
    // $foundAdIds[] = $googleAd->ad->id;
});
PHP has an array class, SplFixedArray:
$array = new SplFixedArray(3);
$array[1] = 'test1';
$array[0] = 'test2';
$array[2] = 'test3';
foreach ($array as $k => $v) {
    echo "$k => $v\n";
}
$array[] = 'fails'; // throws a RuntimeException: index out of range
gives
0 => test2
1 => test1
2 => test3
As other people have pointed out, you can't do this in PHP (well, you can create an array of fixed length, but that's not really what you need). What you can do, however, is increase the amount of memory for the process.
ini_set('memory_limit', '1024M');
Put that at the top of your PHP script and you should be OK. You can also set this in the php.ini file. This does not allocate 1GB of memory to PHP, but rather allows PHP to expand its memory usage up to that point.
A couple of things to point out though:
This might not be allowed on some shared hosts
If you're using this much memory, you might need to have a look at how you're doing things and see if they can be done more efficiently
Look out for opportunities to clear out unneeded resources (do you really need to keep hold of $x that contains a huge object you've already used?) using unset($x);
The quick answer is: you can't
PHP is quite different from Java.
You can make an array with specific values, as you said, but then you already have to know those values. You can 'fake' it by filling it with null values, but that's about the same thing, to be honest.
So unless you want to just create one with array_fill and null (which is a hack in my head), you just can't.
(You might want to check your reasoning about the memory. Are you sure this isn't an XY problem? Since memory is limited by a number (max usage), I don't think fragmentation would have much effect. Check what is actually taking your memory rather than trying to go down this road.)
The closest you will get is using SplFixedArray. It doesn't preallocate the memory needed to store the values (because you can't pre-specify the type of values used), but it preallocates the array slots and doesn't need to resize the array itself as you add values.
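A small sketch of that pattern (the initial size and the 1.5 growth factor are arbitrary choices; setSize() expects an integer, hence the cast):
$ids = new SplFixedArray(1000);           // pre-allocate the slots
$n = 0;
foreach (range(1, 5000) as $id) {         // stand-in for the real data source
    if ($n >= $ids->getSize()) {
        $ids->setSize((int) ($ids->getSize() * 1.5));  // grow when full
    }
    $ids[$n++] = $id;
}
$ids->setSize($n);                        // trim unused slots at the end
var_dump($ids->getSize());                // int(5000)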

for loop vs while loop vs foreach loop PHP

First off, I'm new to PHP. I have been using for loops, while loops, and foreach loops in scripts, and I wonder:
which one is better for performance?
what's the criteria to select a loop?
which should be used when we loop inside another loop?
This is the code I'm stuck on, wondering which loop to use:
for ($i = 0; $i < count($all); $i++)
{
    // do some tasks here
    for ($j = 0; $j < count($rows); $j++)
    {
        // do some other tasks here
    }
}
It's pretty obvious that I can write the above code using while. I hope someone will help me figure out which loop is better to use.
which one is better for performance?
It doesn't matter.
what's the criteria to select a loop?
If you just need to walk through all the elements of an object or array, use foreach. Cases where you need for include
When you explicitly need to do things with the numeric index, for example:
when you need to use previous or next elements from within an iteration
when you need to change the counter during an iteration
foreach is much more convenient because it doesn't require you to set up the counting, and can work its way through any kind of member - be it object properties or associative array elements (which a for won't catch). It's usually best for readability.
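A quick illustration: a counter-based for loop only sees numeric indexes, while foreach walks string keys (or object properties) just as happily:
$ages = array('alice' => 30, 'bob' => 25);
foreach ($ages as $name => $age) {
    echo "$name is $age\n";   // alice is 30, bob is 25
}
// for ($i = 0; $i < count($ages); $i++) { echo $ages[$i]; }  // would only raise undefined-index notices here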
which should be used when we loop inside another loop?
Both are fine; in your demo case, foreach is the simplest way to go.
which one is better for performance?
Who cares? It won't be significant. Ever. If these sorts of tiny optimizations mattered, you wouldn't be using PHP.
what's the criteria to select a loop?
Pick the one that's easiest to read and least likely to cause mistakes in the future. When you're looping through integers, for loops are great. When you're looping through a collection like an array, foreach is great, when you just need to loop until you're "done", while is great.
This may depend on stylistic rules too (for example, in Python you almost always want to use a foreach loop because that's "the way it's done in Python"). I'm not sure what the standard is in PHP though.
which should be used when we loop inside another loop?
Whichever loop type makes the most sense (see the answer above).
In your code, the for loop seems pretty natural to me, since you have a defined start and stop index.
Check http://www.phpbench.com/ for a good reference on some PHP benchmarks.
The for loop is usually pretty fast. Don't put count($rows) or count($all) in the for condition itself; do it outside, like so:
$count_all = count($all);
for ($i = 0; $i < $count_all; $i++)
{
    // Code here
}
Placing count($all) in the for condition makes PHP evaluate it on every iteration. Calculating the value first and then using the result in the loop means it only runs once.
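Applied to the nested loops from the question, hoisting both counts looks like this:
$count_all = count($all);
$count_rows = count($rows);
for ($i = 0; $i < $count_all; $i++)
{
    // do some tasks here
    for ($j = 0; $j < $count_rows; $j++)
    {
        // do some other tasks here
    }
}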
For performance it does not matter whether you choose a for or a while loop; the number of iterations determines execution time.
If you know the number of iterations beforehand, choose a for loop. If you want to run and stop on a condition, use a while loop.
A for loop is more appropriate when you know in advance how many iterations to perform.
A while loop is used in the opposite case (when you don't know how many iterations are needed).
A foreach loop is best when you have to iterate over collections.
To the best of my knowledge, there is little to no performance difference between a while loop and a for loop; I don't know about the foreach loop.
Performance:
Easy enough to test. If you're doing something like machine learning or big data, and the cycles really matter, you should look at something that's compiled rather than interpreted. There are benchmarks comparing the various programming languages out there. On my systems, with these examples in PHP, the do-while loop looks like the winner.
$my_var = "some random phrase";
function fortify($my_var){
for($x=0;isset($my_var[$x]);$x++){
echo $my_var[$x]." ";
}
}
function whilst($my_var){
$x=0;
while(isset($my_var[$x])){
echo $my_var[$x]." ";
$x++;
}
}
function dowhilst($my_var){
$x=0;
do {
echo $my_var[$x]." ";
$x++;
} while(isset($my_var[$x]));
}
function forstream(){
for($x=0;$x<1000001;$x++){
//simple reassignment
$v=$x;
}
return "For Count to $v completed";
}
function whilestream(){
$x=0;
while($x<1000001){
$v=$x;
$x++;
}
return "While Count to 1000000 completed";
}
function dowhilestream(){
$x=0;
do {
$v=$x;
$x++;
} while ($x<1000001);
return "Do while Count to 1000000 completed";
}
function dowhilestream2(){
$x=0;
do {
$v=$x;
$x++;
} while ($x!=1000001);
return "Do while Count to 1000000 completed";
}
$array = array(
    // for the first 3, we're adding a space after every character.
    'fortify' => $my_var,
    'whilst' => $my_var,
    'dowhilst' => $my_var,
    // for these we're simply counting to 1,000,000 from 0,
    // assigning the value of x to v
    'forstream' => '',
    'whilestream' => '',
    'dowhilestream' => '',
    // notice how on this one the != operator is slower than
    // the < operator
    'dowhilestream2' => ''
);
function results($array) {
    foreach ($array as $function => $params) {
        if (empty($params)) {
            $time = microtime(true);
            $results = call_user_func($function);
        } elseif (!is_array($params)) {
            $time = microtime(true);
            $results = call_user_func($function, $params);
        } else {
            $time = microtime(true);
            $results = call_user_func_array($function, $params);
        }
        $total = number_format(microtime(true) - $time, 10);
        echo "<fieldset><legend>Result of <em>$function</em></legend>".PHP_EOL;
        if (!empty($results)) {
            echo "<pre><code>".PHP_EOL;
            var_dump($results);
            echo PHP_EOL."</code></pre>".PHP_EOL;
        }
        echo "<p>Execution Time: $total</p></fieldset>".PHP_EOL;
    }
}
results($array);
Criteria: while, for, and foreach are the major control structures most people use in PHP. do-while is faster than while in my tests, but largely underused in most PHP coding examples on the web.
for is count controlled, so it iterates a specific number of times; though it is slower in my own results than using a while for the same thing.
while is good when something might start out as false, so it can prevent something from ever running and wasting resources.
do-while runs at least once, and then continues until the condition returns false. It's a little faster than a while loop in my results, but it's always going to run at least once.
foreach is good for iterating through an array or object. Even though you can loop through a string with a for statement using array syntax, you can't use foreach on a string directly in PHP.
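A small example of the string case: array-style indexing works in a for loop, while foreach needs the string turned into an array first, e.g. with str_split():
$word = 'php';
for ($i = 0; isset($word[$i]); $i++) {
    echo $word[$i]." ";          // p h p
}
foreach (str_split($word) as $char) {
    echo $char." ";              // foreach only works once the string is an array
}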
Control Structure Nesting: It really depends on what you're doing to determine which control structure to use when nesting. In some cases, like object-oriented programming, you'll actually want to call functions that contain your control structures (individually) rather than writing massive procedural-style programs with many nested controls. This can make the code easier to read, debug, and instantiate.

Does unsetting array values during iterating save on memory?

This is a simple programming question, coming from my lack of knowledge of how PHP handles array copying and unsetting during a foreach loop. It's like this, I have an array that comes to me from an outside source formatted in a way I want to change. A simple example would be:
$myData = array('Key1' => array('value1', 'value2'));
But what I want would be something like:
$myData = array(0 => array('MyKey' => array('Key1' => array('value1', 'value2'))));
So I take the first $myData and format it like the second $myData. I'm totally fine with my formatting algorithm. My question lies in finding a way to conserve memory since these arrays might get a little unwieldy. So, during my foreach loop I copy the current array value(s) into the new format, then I unset the value I'm working with from the original array. E.g.:
$formattedData = array();
foreach ($myData as $key => $val) {
    // do some formatting here, copy to $reformattedVal
    $formattedData[] = $reformattedVal;
    unset($myData[$key]);
}
Is the call to unset() a good idea here? I.e., does it conserve memory since I have copied the data and no longer need the original value? Or, does PHP automatically garbage collect the data since I don't reference it in any subsequent code?
The code runs fine, and so far my datasets have been too negligible in size to test for performance differences. I just don't know if I'm setting myself up for some weird bugs or CPU hits later on.
Thanks for any insights.
-sR
Use a reference to the variable in the foreach loop using the & operator. This avoids making a copy of the array in memory for foreach to iterate over.
edit: as pointed out by Artefacto, unsetting the variable only decreases the number of references to the original value, so the memory saved is only the pointers rather than the value itself. Bizarrely, using a reference actually increases total memory usage, as presumably the value is copied to a new memory location instead of being referenced.
Unless the array is referenced, foreach operates on a copy of the specified array and not the array itself. foreach has some side effects on the array pointer. Don't rely on the array pointer during or after the foreach without resetting it.
Use memory_get_usage() to identify how much memory you are using.
There is a good write up on memory usage and allocation here.
This is useful test code to see memory allocation - try uncommenting the commented lines to see total memory usage in different scenarios.
echo memory_get_usage() . PHP_EOL;

$test = $testCopy = array();
$i = 0;
while ($i++ < 100000) {
    $test[] = $i;
}

echo memory_get_usage() . PHP_EOL;

foreach ($test as $k => $v) {
//foreach ($test as $k => &$v) {
    $testCopy[$k] = $v;
    //unset($test[$k]);
}

echo memory_get_usage() . PHP_EOL;
I was running out of memory while processing lines of a text (xml) file within a loop. For anyone with a similar situation, this worked for me:
while ($data = array_pop($xml_data)) {
    // process $data
}
Please remember the rules of Optimization Club:
The first rule of Optimization Club is, you do not Optimize.
The second rule of Optimization Club is, you do not Optimize without measuring.
If your app is running faster than the underlying transport protocol, the optimization is over.
One factor at a time.
No marketroids, no marketroid schedules.
Testing will go on as long as it has to.
If this is your first night at Optimization Club, you have to write a test case.
Rules #1 and #2 are especially relevant here. Unless you know that you need to optimize, and unless you have measured that need to optimize, then don't do it. Adding the unset will add a run-time hit and will make future programmers wonder why you are doing it.
Leave it alone.
If at any point in the "formatting" you do something like:
$reformattedVal['a']['b'] = $myData[$key];
Then doing unset($myData[$key]); is irrelevant memory-wise because you are only decreasing the reference count of the variable, which now exists in two places (inside $myData[$key] and $reformattedVal['a']['b']). Actually, you save the memory of indexing the variable inside the original array, but that's almost nothing.
Unless you're accessing the element by reference, unsetting will do nothing whatsoever, as you can't alter the array from within the iterator.
That said, it's generally considered bad practice to modify the collection you're iterating over - a better approach would be to break down the source array into smaller chunks (by only loading a portion of the source data at a time) and process these, unsetting each entire array "chunk" as you go.
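A rough sketch of that chunked approach (the chunk size is arbitrary; array_splice() removes the processed slice from the source as it goes, so the source shrinks while the result grows):
$chunkSize = 1000;
$formattedData = array();
while ($myData) {
    // take the next slice and remove it from the source in one step
    // (note that array_splice() does not preserve numeric keys)
    $chunk = array_splice($myData, 0, $chunkSize);
    foreach ($chunk as $val) {
        // do some formatting here, copy to $reformattedVal
        $formattedData[] = $val;   // stand-in for $reformattedVal
    }
    unset($chunk);                 // drop the whole chunk at once
}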

for vs foreach vs while which is faster for iterating through arrays in php

Which one is the fastest for iterating through arrays in PHP? Or does another construct exist that is even faster for iterating through arrays?
Even if there is any kind of difference, that difference will be so small it won't matter at all.
If you have, say, one query to the database, it'll take so long compared to the loop iterating over the results that the eternal debate of for vs foreach vs while will not change a thing -- at least if you have a reasonable amount of data.
So, use :
whatever you like
whatever fits your programming standard
whatever is best suited for your code/application
There will be plenty of other things you could/should optimize before thinking about that kind of micro-optimization.
And if you really want some numbers (even if it's just for fun), you can make some benchmark and see the results in practice.
For me I pick my loop based on this:
foreach
Use when iterating through an array whose length is (or can be) unknown.
for
Use when iterating through an array whose length is set, or, when you need a counter.
while
Use when you're iterating through an array with the express purpose of finding, or triggering a certain flag.
Now it's true, you can use a FOR loop like a FOREACH loop, by using count($array)... so ultimately it comes down to your personal preference/style.
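For example, these two loops over a plain indexed array do the same work (with count() hoisted out of the condition, as discussed elsewhere on this page):
$items = array('a', 'b', 'c');
foreach ($items as $item) {
    echo $item;
}
$count = count($items);
for ($i = 0; $i < $count; $i++) {
    echo $items[$i];
}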
In general there are no appreciable speed differences between the three constructs.
Here are benchmark results demonstrating the efficiency of various methods used to iterate over an array of 1 to 10,000.
Benchmark results of varying PHP versions: https://3v4l.org/a3Jn4
while $i++: 0.00077605247497559 sec
for $i++: 0.00073003768920898 sec
foreach: 0.0004420280456543 sec
while current, next: 0.024288892745972 sec
while reset, next: 0.012929201126099 sec
do while next: 0.011449098587036 sec //added after terminal benchmark
while array_shift: 0.36452603340149 sec
while array_pop: 0.013902902603149 sec
These figures take into consideration an individual call to count() for the while and for versions:
$values = range(1, 10000);

$l = count($values);
$i = 0;
while ($i < $l) {
    $i++;
}

$l = count($values);
for ($i = 0; $i < $l; $i++) {
}

foreach ($values as $val) {
}
The examples below, using while, demonstrate how it can be used less efficiently during iteration.
When functionally iterating over an array and maintaining the current position, while becomes much less efficient, as next() and current() are called during the iteration.
while ($val = current($values)) {
    next($values);
}
If the current positioning of the array is not important, you can call reset() or current() prior to iteration.
$value = reset($values);
while ($value) {
    $value = next($values);
}
do ... while is an alternative syntax that can be used, also in conjunction with calling reset() or current() prior to iteration, with the next() call moved to the end of the iteration.
$value = current($values);
do {
} while ($value = next($values));
array_shift can also be called during the iteration, but that negatively impacts performance greatly, due to array_shift re-indexing the array each time it is called.
while ($values) {
    array_shift($values);
}
Alternatively array_reverse can be called prior to iteration, in conjunction with calling array_pop. This will avoid the impact from re-indexing when calling array_shift.
$values = array_reverse($values);
while ($values) {
    array_pop($values);
}
In conclusion, the speed of while, for, and foreach should not be the question, but rather what is done within them to maintain positioning of the array.
Terminal tests were run on PHP 5.6.20 x64 NTS CLI.
Correctly used, while is the fastest, as it can have only one check for every iteration, comparing one $i with a $max variable, with no additional calls before the loop (except setting $max) or during the loop (except $i++;, which is inherent in any loop statement).
When you start misusing it (like while(list(...))) you're better off with foreach, of course, as every separate function call will not be as optimized as the iteration built into foreach (because that one is pre-optimized).
Even then, array_keys() gives you the same usability as foreach, still faster.
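A sketch of that array_keys() idea ($data here stands in for whatever array you are iterating; whether it actually beats foreach is something to benchmark on your own PHP version):
$keys = array_keys($data);
$max  = count($keys);
$i    = 0;
while ($i < $max) {
    $value = $data[$keys[$i]];
    // ... work with $keys[$i] and $value ...
    $i++;
}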
And beyond that, if you're working with 2D arrays, a home-made 2d_array_keys will let you use while all the way down, much faster than nested foreach loops (just try telling the inner foreach that the outer foreach had 2d_array_keys($array) as its keys).
Besides, all questions related to first or last item of a loop using a while($i
And
while ($people_care_about_optimization !== true) {
    echo "there still exists a better way of doing it and there's no reason to use any other one";
}
Make a benchmark test.
There is no major "performance" difference, because the real differences are in the logic you put inside the loop.
You use foreach for array iteration without integers as keys.
You use for for array iteration with integers as keys.
etc.
Remember that prefetching a lot of mysqli results into a convenient array can raise the question of whether it is better to use for/foreach/while to cycle through that array, but that is the wrong question about a bad solution that wastes a lot of RAM.
So do not prefer this:
function funny_query_results($query) {
    $results = $GLOBALS['mysqli']->query($query);
    $rows = [];
    while ($row = $results->fetch_object()) {
        $rows[] = $row;
    }
    return $rows;
}
$rows = funny_query_results("SELECT ...");
foreach ($rows as $row) { // Uh... What should I use? foreach VS for VS while?
    echo $row->something;
}
The direct way, fetching each result one by one in a simple while loop, is far better optimized:
$results = $mysqli->query("SELECT ...");
while ($row = $results->fetch_object()) {
    echo $row->something;
}

The difference between loops

It's about PHP but I've no doubt many of the same comments will apply to other languages.
Simply put, what are the differences in the different types of loop for PHP? Is one faster/better than the others or should I simply put in the most readable loop?
for ($i = 0; $i < 10; $i++)
{
# code...
}
foreach ($array as $index => $value)
{
# code...
}
do
{
# code...
}
while ($flag == false);
For loops and while loops are entry-condition loops. They evaluate the condition first, so the statement block associated with the loop won't run even once if the condition is not met.
The statements inside this for loop block will run 10 times; the value of $i will be 0 to 9:
for ($i = 0; $i < 10; $i++)
{
    # code...
}
The same thing done with a while loop:
$i = 0;
while ($i < 10)
{
    # code...
    $i++;
}
A do-while loop is an exit-condition loop. It's guaranteed to execute once; then it evaluates the condition before repeating the block.
do
{
    # code...
}
while ($flag == false);
foreach is used to access array elements from start to end. At the beginning of the foreach loop the internal pointer of the array is set to the first element; at each step it is moved to the next element, and so on until the array ends. In the loop block the value of the current array item is available as $value and the key of the current item is available as $index.
foreach ($array as $index => $value)
{
    # code...
}
You could do the same thing with a while loop, like this (note that this version stops early if an element evaluates to false):
while (current($array))
{
    $index = key($array);    // get the key of the current element
    $value = $array[$index]; // get the value of the current element
    # code ...
    next($array);            // advance the internal array pointer of $array
}
And lastly: The PHP Manual is your friend :)
This is CS101, but since no one else has mentioned it, while loops evaluate their condition before the code block, and do-while evaluates after the code block, so do-while loops are always guaranteed to run their code block at least once, regardless of the condition.
PHP Benchmarks
#brendan:
The article you cited is seriously outdated and the information is just plain wrong. Especially the last point (use for instead of foreach) is misleading and the justification offered in the article no longer applies to modern versions of .NET.
While it's true that the IEnumerator uses virtual calls, these can actually be inlined by a modern compiler. Furthermore, .NET now knows generics and strongly typed enumerators.
There are a lot of performance tests out there that prove conclusively that for is generally no faster than foreach. Here's an example.
I use the first loop when iterating over a conventional (indexed?) array and the foreach loop when dealing with an associative array. It just seems natural and helps the code flow and be more readable, in my opinion. As for do...while loops, I use those when I have to do more than just flip through an array.
I'm not sure of any performance benefits, though.
Performance is not significantly better in either case. While is useful for more complex tasks than iterating, but for and while are functionally equivalent.
Foreach is nice, but has one important caveat: you can't modify the enumerable you're iterating. So no removing, adding or replacing entries to/in it. Modifying entries (like changing their properties) is OK, of course.
With a foreach loop, a copy of the original array is made in memory to use inside. You shouldn't use them on large structures; a simple for loop is a better choice. You can use a while loop more efficiently on a large non-numerically indexed structure like this:
while (list($key, $value) = each($array)) { // note: each() was deprecated in PHP 7.2 and removed in PHP 8
But that approach is particularly ugly for a simple small structure.
while loops are better suited for looping through streams, or as in the following example that you see very frequently in PHP:
while ($row = mysql_fetch_array($result)) {
Almost all of the time the different loops are interchangeable, and it will come down to either a) efficiency, or b) clarity.
If you know the efficiency trade-offs of the different types of loops, then yes, to answer your original question: use the one that looks the most clean.
Each looping construct serves a different purpose.
for - This is used to loop for a specific number of iterations.
foreach - This is used to loop through all of the values in a collection.
while - This is used to loop until you meet a condition.
Of the three, "while" will most likely provide the best performance in most situations. Of course, if you do something like the following, you are basically rewriting the "for" loop (which in C# is slightly more performant).
$count = 0;
do
{
    ...
    $count++;
}
while ($count < 10);
They all have different basic purposes, but they can also be used in somewhat the same way. It completely depends on the specific problem that you are trying to solve.
With a foreach loop, a copy of the original array is made in memory to use inside.
Foreach is nice, but has one important caveat: you can't modify the enumerable you're iterating.
Both of those won't be a problem if you pass by reference instead of value:
foreach ($array as &$value) {
I think this has been allowed since PHP 5.
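One caveat with the by-reference form: $value stays bound to the last element after the loop, so it is usually a good idea to unset it afterwards:
$array = array(1, 2, 3);
foreach ($array as &$value) {
    $value *= 2;   // modifies the original array in place
}
unset($value);     // otherwise a later foreach over $array would overwrite the last element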
When accessing the elements of an array, for clarity I would use a foreach whenever possible, and only use a for if you need the actual index values (for example, the same index in multiple arrays). This also minimizes the chance of typo mistakes, which for loops make all too easy. In general, PHP is probably not the place to worry too much about performance. And last but not least, for and foreach have (or should have; I'm not a PHP-er) the same big-O time (O(n)), so at worst you are looking at slightly more memory usage or a small constant or linear hit in time.
In regards to performance, a foreach is more consuming than a for
http://forums.asp.net/p/1041090/1457897.aspx
