Optimize a Laravel project by extracting methods - PHP

I am working on an ongoing Laravel project that uses Laravel 5.1. There is a process that takes over 8 seconds. Within that process, there is a method with almost 250 lines of code. From an optimization perspective, if I extract that code into several methods, will it be faster, or will it take more time?

Breaking a big function into multiple small functions is not going to make it faster; if anything, it will become marginally slower because of the extra function calls.
I would recommend changing your approach instead, for example by using queues for asynchronous processes.
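For illustration, here is a minimal sketch of what moving the slow work onto a queue could look like in Laravel 5.1. The class name, the $reportId property and the dispatch call site are hypothetical; only the base Job class, the ShouldQueue contract and the traits come from the framework:
// app/Jobs/ProcessReport.php (class name and $reportId are made up for the example)
namespace App\Jobs;

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessReport extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $reportId;

    public function __construct($reportId)
    {
        $this->reportId = $reportId;
    }

    public function handle()
    {
        // The slow ~250-line routine goes here. A queue worker runs it in the
        // background, so the HTTP request no longer waits the full 8 seconds.
    }
}

// In a controller (the DispatchesJobs trait is available by default in 5.1):
// $this->dispatch(new ProcessReport($reportId));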

With PHP there are usually a few things you can do to speed up code:
Use references where possible.
Whenever you call a function with parameters, PHP passes copies of the variables (copy-on-write, so the actual copy is only made if the function modifies them). If a function that takes large arrays or strings as parameters is called frequently and modifies them, this can be very expensive. So instead of doing things like:
function f($array) {
Use:
function f(&$array) {
This implies any modification to $array will affect the original array too, so it should be used only in situations where either $array isn't modified or you don't care that it is modified.
The same applies to foreach loops. Just make sure to unset() the reference variable after the loop is done to avoid problems.
foreach ($set as &$element) { ... }
unset($element);
Another trick is to pre-allocate buffers instead of letting them grow automatically. For example, instead of:
$numbers = [1, 2, 3, 4, ... 10000000];
$inverted = [];
foreach ($numbers as $num) { $inverted[] = 1 / $num; }
You make sure $inverted already has the size required to fit all the elements:
$numbers = [1, 2, 3, 4, ... 10000000];
$inverted = array_fill(0, count($numbers), 0);
foreach ($numbers as $i => $num) { $inverted[$i] = 1 / $num; }
Likewise, avoid resizing strings and arrays whenever possible. Work with indexes etc.
Regular expressions, while convenient, are the enemy of performance. Avoid using them for simple tasks.
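For instance (a made-up prefix check, just to illustrate the point), a fixed-prefix test does not need a regular expression:
// With a regular expression:
if (preg_match('/^user_/', $key)) { /* ... */ }

// With a plain string function, which is typically cheaper:
if (strpos($key, 'user_') === 0) { /* ... */ }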
All of that, of course, assumes the problem is in the PHP code. Maybe it isn't. If you are using a database, you might want to reassess how you perform your SQL queries. The way you structure them can impact performance severely.
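In a Laravel project, a classic example is running one query per loop iteration instead of eager loading up front. A rough sketch, with a hypothetical Order model and customer relation:
// N+1 pattern: one query for the orders, plus one query per order
$orders = Order::all();
foreach ($orders as $order) {
    echo $order->customer->name; // lazy-loads each customer separately
}

// Eager loading: two queries in total, no matter how many orders there are
$orders = Order::with('customer')->get();
foreach ($orders as $order) {
    echo $order->customer->name;
}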

Related

Best way to count coincidences in an Array in PHP

I have an array with thousands of rows and I want to know the best way, or the best practice, to count the number of rows in PHP that match a condition.
In the example you can see that I am finding the number of records whose score falls within a range.
I'm considering these 2 options:
Option 1:
$rowCount = 0;
foreach ($this->data as $row) {
if ($row['score'] >= $rangeStart && $row['score'] <= $rangeEnd) {
$rowCount++;
}
}
return $rowCount;
Option 2:
$countScoreRange = array_filter($this->data, function($result) {
return $result['score'] >= $this->rangeStart && $result['score'] <= $this->rangeEnd;
});
return count($countScoreRange);
Thanks in advance.
It depends on what you mean by best practices.
If by best practice you mean performance, there is one tradeoff that you must care about:
**speed <===> memory**
If you need memory performance:
When you think about performance while iterating over an iterable in PHP, you can use yield to create a generator function. From the PHP docs:
What is a generator function?
A generator function looks just like a normal function, except that instead of returning a value, a generator yields as many values as it needs to. Any function containing yield is a generator function.
Why should we use a generator function?
A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate.
So it's better to dedicate a little memory per step instead of reserving an array of thousands of elements.
Unfortunately:
PHP does not allow you to use the array traversal functions on generators, including array_filter, array_map, etc.
So, to sum up, if:
you are iterating an array
of thousands of elements,
especially inside a function that is called from many places,
and you care about performance, especially memory usage,
then it is highly recommended to use generator functions instead.
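As a rough sketch of what that looks like for the range-count problem in the question (the property names mirror the question, the rest is illustrative):
function matchingRows(array $data, $rangeStart, $rangeEnd)
{
    foreach ($data as $row) {
        if ($row['score'] >= $rangeStart && $row['score'] <= $rangeEnd) {
            yield $row; // rows are handed out one at a time, no second array is built
        }
    }
}

$rowCount = 0;
foreach (matchingRows($this->data, $this->rangeStart, $this->rangeEnd) as $row) {
    $rowCount++;
}
return $rowCount;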
**But if you need speed performance:**
The comparison is between 4 things:
for
foreach
array_* functions
array pointer functions (such as next(), reset(), etc.)
'foreach' is slow in comparison to the 'for' loop, because foreach copies the array over which the iteration is performed.
But there are some tricks you can use if you want to keep it:
For improved performance, use references. Beyond that, 'foreach' is simply easy to use.
What about foreach vs array_filter?
As the previous answer said, and as published benchmarks of this comparison show:
it is incorrect to assume that the array_* functions are faster!
Of course, if you work on a performance-critical system you should consider this advice. Array functions are a little bit slower than basic loops, but in most cases there is no significant impact on performance.
Using foreach is much faster, and incrementing your own counter is also faster than calling count() on the filtered array.
I once tested the performance of both, and foreach was about 3x faster than array_filter.
I'll go with the first option.
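If you want to verify that on your own data, a quick microtime() comparison along these lines will do (the test data here is made up):
$data = array();
for ($i = 0; $i < 100000; $i++) {
    $data[] = array('score' => rand(0, 100));
}
$rangeStart = 25;
$rangeEnd = 75;

$start = microtime(true);
$rowCount = 0;
foreach ($data as $row) {
    if ($row['score'] >= $rangeStart && $row['score'] <= $rangeEnd) {
        $rowCount++;
    }
}
echo "foreach: ".(microtime(true) - $start)."\n";

$start = microtime(true);
$matches = array_filter($data, function ($row) use ($rangeStart, $rangeEnd) {
    return $row['score'] >= $rangeStart && $row['score'] <= $rangeEnd;
});
echo "array_filter: ".(microtime(true) - $start)."\n";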

What is better in a foreach loop... using the & symbol or reassigning based on key?

Consider the following PHP Code:
//Method 1
$array = array(1,2,3,4,5);
foreach($array as $i=>$number){
$number++;
$array[$i] = $number;
}
print_r($array);
//Method 2
$array = array(1,2,3,4,5);
foreach($array as &$number){
$number++;
}
print_r($array);
Both methods accomplish the same task, one by assigning a reference and another by re-assigning based on key. I want to use good programming techniques in my work and I wonder which method is the better programming practice? Or is this one of those it doesn't really matter things?
Since the highest scoring answer states that the second method is better in every way, I feel compelled to post an answer here. True, looping by reference is more performant, but it isn't without risks/pitfalls.
Bottom line, as always: "Which is better X or Y", the only real answers you can get are:
It depends on what you're after/what you're doing
Oh, both are OK, if you know what you're doing
X is good for Such, Y is better for So
Don't forget about Z, and even then ...("which is better X, Y or Z" is the same question, so the same answers apply: it depends, both are ok if...)
Be that as it may, as Orangepill showed, the reference approach offers better performance. In this case, the tradeoff is one of performance vs code that is less error-prone and easier to read/maintain. In general, it's considered better to go for safer, more reliable, and more maintainable code:
'Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.' — Brian Kernighan
I guess that means the first method has to be considered best practice. But that doesn't mean the second approach should be avoided at all times, so what follows here are the downsides, pitfalls and quirks that you'll have to take into account when using a reference in a foreach loop:
Scope:
For a start, PHP isn't truly block-scoped like C(++), C#, Java, Perl or (with a bit of luck) ECMAScript6... That means that the $value variable will not be unset once the loop has finished. When looping by reference, this means a reference to the last value of whatever object/array you were iterating is floating around. The phrase "an accident waiting to happen" should spring to mind.
Consider what happens to $value, and subsequently $array, in the following code:
$array = range(1,10);
foreach($array as &$value)
{
$value++;
}
echo json_encode($array);
$value++;
echo json_encode($array);
$value = 'Some random value';
echo json_encode($array);
The output of this snippet will be:
[2,3,4,5,6,7,8,9,10,11]
[2,3,4,5,6,7,8,9,10,12]
[2,3,4,5,6,7,8,9,10,"Some random value"]
In other words, by reusing the $value variable (which references the last element in the array), you're actually manipulating the array itself. This makes for error-prone code, and difficult debugging. As opposed to:
$array = range(1,10);
$array[] = 'foobar';
foreach($array as $k => $v)
{
$array[$k]++;//increments foobar, to foobas!
if ($array[$k] === ($v +1))//$v + 1 yields 1 if $v === 'foobar'
{//so 'foobas' === 1 => false
$array[$k] = $v;//restore initial value: foobar
}
}
Maintainability/idiot-proofness:
Of course, you might say that the dangling reference is an easy fix, and you'd be right:
foreach($array as &$value)
{
$value++;
}
unset($value);
But after you've written your first 100 loops with references, do you honestly believe you won't have forgotten to unset a single reference? Of course not! It's so uncommon to unset variables that have been used in a loop (we assume the GC will take care of it for us) that most of the time you don't bother. When references are involved, this is a source of frustration, mysterious bug reports, or traveling values, especially when you're using complex nested loops, possibly with multiple references... The horror, the horror.
Besides, as time passes, who's to say that the next person working on your code won't forget about unset? Who knows, they might not even know about references, or might see your numerous unset calls, deem them redundant, a sign of your being paranoid, and delete them altogether. Comments alone won't help you: they need to be read, and everyone working with your code should be thoroughly briefed, perhaps even made to read a full article on the subject. The examples listed in the linked article are bad, but I've seen worse, still:
foreach($nestedArr as &$array)
{
if (count($array)%2 === 0)
{
foreach($array as &$value)
{//pointless, but you get the idea...
$value = array($value, 'Part of even-length array');
}
//$value now references the last index of $array
}
else
{
$value = array_pop($array);//assigns new value to var that might be a reference!
$value = is_numeric($value) ? $value/2 : null;
array_push($array, $value);//congrats, X-references ==> traveling value!
}
}
This is a simple example of a traveling value problem. I did not make this up, BTW, I've come across code that boils down to this... honestly. Quite apart from spotting the bug, and understanding the code (which has been made more difficult by the references), it's still quite obvious in this example, mainly because it's a mere 15 lines long, even using the spacious Allman coding style... Now imagine this basic construct being used in code that actually does something even slightly more complex, and meaningful. Good luck debugging that.
Side-effects:
It's often said that functions shouldn't have side-effects, because side-effects are (rightfully) considered to be code-smell. Though foreach is a language construct, and not a function, in your example, the same mindset should apply. When using too many references, you're being too clever for your own good, and might find yourself having to step through a loop, just to know what is being referenced by what variable, and when.
The first method hasn't got this problem: you have the key, so you know where you are in the array. What's more, with the first method, you can perform any number of operations on the value, without changing the original value in the array (no side-effects):
function recursiveFunc($n, $max = 10)
{
if (--$max)
{
return $n === 1 ? 10-$max : recursiveFunc($n%2 ? ($n*3)+1 : $n/2, $max);
}
return null;
}
$array = range(10,20);
foreach($array as $k => $v)
{
$v = recursiveFunc($v);//reassigning $v here
if ($v !== null)
{
$array[$k] = $v;//only now, will the actual array change
}
}
echo json_encode($array);
This generates the output:
[7,11,12,13,14,15,5,17,18,19,8]
As you can see, the first, seventh and last elements have been altered, the others haven't. If we were to rewrite this code using a loop by reference, the loop looks a lot smaller, but the output will be different (we have a side-effect):
$array = range(10,20);
foreach($array as &$v)
{
$v = recursiveFunc($v);//Changes the original array...
//granted, if your version permits it, you'd probably do:
$v = recursiveFunc($v) ?: $v;
}
echo json_encode($array);
//[7,null,null,null,null,null,5,null,null,null,8]
To counter this, we'll either have to create a temporary variable, call the function twice, or add a key and recalculate the initial value of $v, but that's just plain stupid (it's adding complexity to fix what shouldn't be broken):
foreach($array as &$v)
{
$temp = recursiveFunc($v);//creating copy here, anyway
$v = $temp ? $temp : $v;//assignment doesn't require the lookup, though
}
//or:
foreach($array as &$v)
{
$v = recursiveFunc($v) ? recursiveFunc($v) : $v;//2 calls === twice the overhead!
}
//or
$base = reset($array);//get the base value
foreach($array as $k => &$v)
{//silly combine both methods to fix what needn't be a problem to begin with
$v = recursiveFunc($v);
if ($v === 0)
{
$v = $base + $k;
}
}
Anyway, adding branches, temp variables and what have you, rather defeats the point. For one, it introduces extra overhead which will eat away at the performance benefits references gave you in the first place.
If you have to add logic to a loop, to fix something that shouldn't need fixing, you should step back, and think about what tools you're using. 9/10 times, you chose the wrong tool for the job.
The last thing that, to me at least, is a compelling argument for the first method is simple: readability. The reference-operator (&) is easily overlooked if you're doing some quick fixes, or try to add functionality. You could be creating bugs in the code that was working just fine. What's more: because it was working fine, you might not test the existing functionality as thoroughly because there were no known issues.
Discovering a bug that went into production, because of your overlooking an operator might sound silly, but you wouldn't be the first to have encountered this.
Note:
Passing by reference at call-time has been removed since PHP 5.4. Be wary of features/functionality that is subject to change. A standard iteration of an array hasn't changed in years. I guess it's what you could call "proven technology". It does what it says on the tin, and it is the safer way of doing things. So what if it's slower? If speed is an issue, you can optimize your code and introduce references in your loops then.
When writing new code, go for the easy-to-read, most failsafe option. Optimization can (and indeed should) wait until everything's tried and tested.
And as always: premature optimization is the root of all evil. Choose the right tool for the job, not the one that's new and shiny.
As far as performance is concerned Method 2 is better, especially if you either have a large array and/or are using string keys.
While both methods use the same amount of memory, the first method requires the array element to be looked up on every assignment; even though this lookup is done by index, it still has some overhead.
Given this test script:
$array = range(1, 1000000);
$start = microtime(true);
foreach($array as $k => $v){
$array[$k] = $v+1;
}
echo "Method 1: ".((microtime(true)-$start));
echo "\n";
$start = microtime(true);
foreach($array as $k => &$v){
$v+=1;
}
echo "Method 2: ".((microtime(true)-$start));
The average output is
Method 1: 0.72429609298706
Method 2: 0.22671484947205
If I scale back the test to only run ten times instead of 1 million I get results like
Method 1: 3.504753112793E-5
Method 2: 1.2874603271484E-5
With string keys the performance difference is more pronounced.
So, running:
$array = array();
for($x = 0; $x<1000000; $x++){
$array["num".$x] = $x+1;
}
$start = microtime(true);
foreach($array as $k => $v){
$array[$k] = $v+1;
}
echo "Method 1: ".((microtime(true)-$start));
echo "\n";
$start = microtime(true);
foreach($array as $k => &$v){
$v+=1;
}
echo "Method 2: ".((microtime(true)-$start));
Yields performance like
Method 1: 0.90371179580688
Method 2: 0.2799870967865
This is because searching by string key has more overhead than an array index.
It is also worth noting that, as suggested in Elias Van Ootegem's answer, to properly clean up after yourself you should unset the reference after the loop has completed, i.e. unset($v);. The performance gains should also be weighed against the loss in readability.
There are some minor performance differences, but they aren't going to have any significant effect.
I would choose the first option for two reasons:
It's more readable. This is a bit of a personal preference, but at first glance, it's not immediately obvious to me that $number++ is updating the array. By explicitly using something like $array[$i]++, it's much clearer, and less likely to cause confusion when you come back to this code in a year.
It doesn't leave you with a dangling reference to the last item in the array. Consider this code:
$array = array(1,2,3,4,5);
foreach($array as &$number){
$number++;
}
// ... some time later in an unrelated section of code
$number = intval("100");
// now unexpectedly, $array[4] == 100 instead of 6
I guess that depends. Do you care more about code readability/maintainability or about minimizing memory usage? The second method would use slightly less memory, but I would honestly prefer the first usage, as assigning by reference in the foreach definition does not seem to be commonplace practice in PHP.
Personally if I wanted to modify an array in place like this I would go with a third option:
array_walk($array, function(&$value) {
$value++;
});
The first method will be insignificantly slower, because on each pass through the loop it assigns a new value to the $number variable. The second method uses the variable directly, so it doesn't need to assign a new value on each iteration.
But, as I said, the difference is not significant, the main thing to consider is readability.
In my opinion, the first method makes more sense when you don't need to modify the value inside the loop and $number is only read.
The second method makes more sense when you need to modify the $number variable often, as you don't need to repeat the key each time you want to modify it, and it is more readable.
Have you considered array_map? It is designed to change values inside arrays.
$array = array(1,2,3,4,5);
$new = array_map(function($number){
return $number + 1; // note: $number++ would return the original value unchanged
}, $array) ;
var_dump($new) ;
I'd choose #2, but it's a personal preference.
I disagree with the other answers, using references to array items in foreach loops is quite common, but it depends on the framework you're using. As always, try to follow existing coding conventions in your project or framework.
I also disagree with the other answers that suggest array_map or array_walk. These introduce the overhead of a function call for each array element. For small arrays, this won't be significant, but for large arrays, this will add a significant overhead for such a simple function. However, they are appropriate if you're performing more significant calculations or actions - you'll need to decide which to use depending on the scenario, perhaps by benchmarking.
Most of the answers interpreted your question to be about performance.
This is not what you asked. What you asked is:
I wonder which method is the better programming practice?
As you said, both do the same thing. Both work. In the end, better is often a matter of opinion.
Or is this one of those it doesn't really matter things?
I wouldn't go so far as to say it doesn't matter. As you can see there can be performance considerations for Method 1 and reference gotchas for Method 2.
I can say what matters more is readability and consistency. While there are dozens of ways to increment array elements in PHP, some look like line noise or code golf.
Ensuring your code is readable to future developers and you consistently apply your method of solving problems is a far better macro programming practice than whatever micro differences exist in this foreach code.

Exploding an array within a foreach loop parameter

foreach(explode(',', $foo) as $bar) { ... }
vs
$test = explode(',', $foo);
foreach($test as $bar) { ... }
In the first example, does it explode the $foo string for each iteration or does PHP keep it in memory exploded in its own temporary variable? From an efficiency point of view, does it make sense to create the extra variable $test or are both pretty much equal?
I could make an educated guess, but let's try it out!
I figured there were three main ways to approach this.
explode and assign before entering the loop
explode within the loop, no assignment
string tokenize
My hypotheses:
1) will probably consume more memory, due to the assignment
2) probably identical to #1 or #3, not sure which
3) probably both quicker and with a much smaller memory footprint
Approach
Here's my test script:
<?php
ini_set('memory_limit', '1024M');
$listStr = 'text';
$listStr .= str_repeat(',text', 9999999);
$timeStart = microtime(true);
/*****
* {INSERT LOOP HERE}
*/
$timeEnd = microtime(true);
$timeElapsed = $timeEnd - $timeStart;
printf("Memory used: %s kB\n", memory_get_peak_usage()/1024);
printf("Total time: %s s\n", $timeElapsed);
And here are the three versions:
1)
// explode separately
$arr = explode(',', $listStr);
foreach ($arr as $val) {}
2)
// explode inline-ly
foreach (explode(',', $listStr) as $val) {}
3)
// tokenize
$tok = strtok($listStr, ',');
while ($tok = strtok(',')) {}
Results
Conclusions
Looks like some assumptions were disproven. Don't you love science? :-)
In the big picture, any of these methods is sufficiently fast for a list of "reasonable size" (few hundred or few thousand).
If you're iterating over something huge, time difference is relatively minor but memory usage could be different by an order of magnitude!
When you explode() inline without pre-assignment, it's a fair bit slower for some reason.
Surprisingly, tokenizing is a bit slower than explicitly iterating a declared array. Working on such a small scale, I believe that's due to the call stack overhead of making a function call to strtok() every iteration. More on this below.
In terms of number of function calls, explode()ing really tops tokenizing. O(1) vs O(n)
I added a bonus to the chart where I run method 1) with a function call in the loop. I used strlen($val), thinking it would be a relatively similar execution time. That's subject to debate, but I was only trying to make a general point. (I only ran strlen($val) and ignored its output. I did not assign it to anything, for an assignment would be an additional time-cost.)
// explode separately
$arr = explode(',', $listStr);
foreach ($arr as $val) {strlen($val);}
As you can see from the results table, it then becomes the slowest method of the three.
Final thought
This is interesting to know, but my suggestion is to do whatever you feel is most readable/maintainable. Only if you're really dealing with a significantly large dataset should you be worried about these micro-optimizations.
In the first case, PHP explodes it once and keeps it in memory.
The impact of creating the extra variable, or not, would be negligible. The PHP interpreter needs to maintain a pointer to the location of the next item either way, whether the array is stored in a user-defined variable or not.
From the point of memory it will not make a difference, because PHP uses the copy on write concept.
Apart from that, I personally would opt for the first option - it's a line less, but not less readable (imho!).
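You can see the copy-on-write behaviour mentioned above with a rough memory check (the exact numbers vary by PHP version):
$a = range(1, 100000);
echo memory_get_usage()."\n"; // baseline with one array in memory

$b = $a;                      // the assignment alone does not duplicate the data
echo memory_get_usage()."\n"; // still roughly the same

$b[0] = 42;                   // the first write is what triggers the real copy
echo memory_get_usage()."\n"; // now noticeably higher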
Efficiency in what sense? Memory or processor? For the processor it wouldn't make a difference; for memory, you can always do $foo = explode(',', $foo).

Is it better to call a function every time or to store its value in a new variable?

I often use the function sizeof($var) in my web application, and I'd like to know whether it is better (in terms of resources) to store this value in a new variable and use that, or to call the function every time; or maybe it makes no difference :)
TLDR: it's better to set a variable, calling sizeof() only once. (IMO)
I ran some tests on the looping aspect of this small array:
$myArray = array("bill", "dave", "alex", "tom", "fred", "smith", "etc", "etc", "etc");
// A)
for($i=0; $i<10000; $i++) {
echo sizeof($myArray);
}
// B)
$sizeof = sizeof($myArray);
for($i=0; $i<10000; $i++) {
echo $sizeof;
}
With an array of 9 items:
A) took 0.0085 seconds
B) took 0.0049 seconds
With an array of 180 items:
A) took 0.0078 seconds
B) took 0.0043 seconds
With an array of 3600 items:
A) took 0.5-0.6 seconds
B) took 0.35-0.5 seconds
Although there isn't much of a difference, you can see that as the array grows, the difference becomes more and more. I think this has made me re-think my opinion, and say that from now on, I'll be setting the variable pre-loop.
Storing a PHP integer takes 68 bytes of memory. This is a small enough amount, that I think I'd rather worry about processing time than memory space.
In general, it is preferable to assign the result of a function you are likely to repeat to a variable.
In the example you suggested, the difference in processing code produced by this approach and the alternative (repeatedly calling the function) would be insignificant. However, where the function in question is more complex it would be better to avoid executing it repeatedly.
For example:
for($i=0; $i<10000; $i++) {
echo date('Y-m-d');
}
Executes in 0.225273 seconds on my server, while:
$date = date('Y-m-d');
for($i=0; $i<10000; $i++) {
echo $date;
}
executes in 0.134742 seconds. I know these snippets aren't quite equivalent, but you get the idea. Over many page loads by many users over many months or years, even a difference of this size can be significant. If we were to use some complex function, serious scalability issues could be introduced.
A main advantage of not assigning a return value to a variable is that you need one less line of code. In PHP, we can commonly do our assignment at the same time as invoking our function:
$sql = "SELECT...";
if(!$query = mysql_query($sql))...
...although this is sometimes discouraged for readability reasons.
In my view for the sake of consistency assigning return values to variables is broadly the better approach, even when performing simple functions.
If you are calling the function over and over, it is probably best to keep this info in a variable. That way the server doesn't have to keep processing the answer, it just looks it up. If the result is likely to change, however, it will be best to keep running the function.
Since you allocate a new variable, this will take a tiny bit more memory. But it might make your code a tiny bit faster.
The trouble it brings, though, could be big. For example, if you include another file that applies the same trick, and both store the size in a variable called $sizeof, bad things might happen: strange bugs that show up when you least expect them, or you forget to add global $sizeof in your function.
There are so many possible bugs you introduce, for what? Since the speed gain is likely not measurable, I don't think it's worth it.
Unless you are calling this function a million times your "performance boost" will be negligible.
I do not think that it really matters. In a sense, you do not want to perform the same thing over and over again, but considering that it is sizeof(), unless it is an enormous array you should be fine either way.
I think, you should avoid constructs like:
for ($i = 0; $i < sizeof($array); $i += 1) {
// do stuff
}
Here, sizeof() will be executed on every iteration, even though the size is often not likely to change.
Whereas in constructs like this:
while(sizeof($array) > 0) {
if ($someCondition) {
$entry = array_pop($array);
}
}
You often have no choice but to calculate it every iteration.

Efficiency of PHP arrays cast as objects?

From what I understand, PHP stdClass objects are generally faster than arrays, when the code is deeply-nested enough for it to actually matter. How is that efficiency affected if I'm typecasting to define stdClass objects on the fly:
$var = (object)array('one' => 1, 'two' => 2);
If the code doing this is going to be executed many many times, will I be better off explicitly defining $var as an object instead:
$var = new stdClass();
$var->one = 1;
$var->two = 2;
Is the difference negligible since I'll then be accessing $var as an object from there on, either way?
Edit:
A stdClass is the datatype I need here. I'm not concerned with whether I should use arrays or whether I should use stdClass objects; I'm more concerned with whether using the (object)array(....) shorthand of instantiating a stdClass is efficient. And yes, this is in code that will be executed potentially thousands of times.
You understand wrong. Objects are not "much faster" than arrays. In fact, the opposite is usually true, since arrays don't have the problem of inheritance (and visibility lookups). Sure, there may be specific cases where you can show a clear significant gain/loss, but in the generic case arrays will tend to be faster...
Use the tool that's semantically correct. Don't avoid using arrays because you think objects are faster. Use a construct when it makes sense. There are times when it makes sense to replace an array with an object (For example, when you want to enforce the types in the array strictly). But this is not one of them. And even at that, you wouldn't replace array() with STDClass, you'd replace it with a custom class (one that likely extends ArrayObject or at least implements Iterator and ArrayAccess interfaces). Different Data Structures exist for a reason. If it was better to use objects instead of arrays, wouldn't we all be using objects instead?
Don't worry about micro-optimizations like "is cast faster". It's almost always not. Write readable, correct code. Then optimize if you're having a problem...
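As a sketch of what the custom class mentioned above could look like when you genuinely need to enforce element types (the class name and the int-only rule are made up for the example):
class IntCollection implements ArrayAccess
{
    private $items = array();

    public function offsetExists($offset) { return isset($this->items[$offset]); }
    public function offsetGet($offset)    { return $this->items[$offset]; }
    public function offsetUnset($offset)  { unset($this->items[$offset]); }

    public function offsetSet($offset, $value)
    {
        if (!is_int($value)) {
            throw new InvalidArgumentException('Only integers allowed');
        }
        if ($offset === null) {
            $this->items[] = $value;        // $coll[] = ... appends
        } else {
            $this->items[$offset] = $value;
        }
    }
}

$ints = new IntCollection();
$ints[] = 1;        // fine
// $ints[] = 'one'; // would throw an InvalidArgumentException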
function benchmark($func_name, $iterations) {
$begin = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
$func_name();
}
$end = microtime(true);
$execution_time = $end - $begin;
echo $func_name , ': ' , $execution_time;
}
function standClass() {
$obj = new stdClass();
$obj->param_one = 1;
$obj->param_two = 2;
}
function castFromArray() {
$obj = (object)array('param_one' => 1, 'param_two' => 2);
}
benchmark('standClass', 1000);
benchmark('castFromArray', 1000);
benchmark('standClass', 100000);
benchmark('castFromArray', 100000);
Outputs:
standClass: 0.0045979022979736
castFromArray: 0.0053138732910156
standClass: 0.27266097068787
castFromArray: 0.20209217071533
Casting from an array to stdClass on the fly is around 30% more efficient, but the difference is still negligible until you know you will be performing the operation 100,000 times (and even then, you're only looking at a tenth of a second, at least on my machine).
So, in short, it doesn't really matter the vast majority of the time, but if it does, define the array in a single command and then type-cast it to an object. I definitely wouldn't spend time worrying about it unless you've identified the code in question as a bottleneck (and even then, focus on reducing your number of iterations if possible).
It's more important to chose the right data structure for the right job.
When you want to decide what to use, ask yourself: What does your data represent?
Are you representing a set, or a sequence? Then you should probably use an array.
Are you representing some thing with certain properties or behaviors? Then you should probably use an object.
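A trivial illustration of that distinction (the names are purely illustrative):
// A sequence of values: an array is the natural fit
$scores = array(72, 85, 91);

// A thing with named properties and behaviour: a dedicated object fits better
class Invoice
{
    public $number;
    public $total;
    public $paidAt;

    public function isPaid()
    {
        return $this->paidAt !== null;
    }
}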
