Which is faster in PHP - sprintf or double quoted strings? [duplicate]

I read somewhere (I thought on codinghorror) that it is bad practice to add strings together as if they are numbers, since, like numbers, strings cannot be changed. Thus, adding them together creates a new string. So, I was wondering, what is the best way to add two strings together, when focusing on performance?
Which of these four is better, or is there another way which is better?
//Note that normally at least one of these two strings is variable
$str1 = 'Hello ';
$str2 = 'World!';
$output1 = $str1.$str2; //This is said to be bad
$str1 = 'Hello ';
$output2 = $str1.'World!'; //Also bad
$str1 = 'Hello';
$str2 = 'World!';
$output3 = sprintf('%s %s', $str1, $str2); //Good?
//This last one is probably more common as:
//$output = sprintf('%s %s', 'Hello', 'World!');
$str1 = 'Hello ';
$str2 = '{a}World!';
$output4 = str_replace('{a}', $str1, $str2);
Does it even matter?

String concatenation with a dot is definitely the fastest of these methods. You will always create a new string, whether you like it or not.
Most likely the fastest way would be:
$str1 = "Hello";
$str1 .= " World";
Do not put them into double-quotes like $result = "$str1$str2"; as this will generate additional overhead for parsing symbols inside the string.
If you are going to use this just for output with echo, then use the feature of echo that you can pass it multiple parameters, as this will not generate a new string:
$str1 = "Hello";
$str2 = " World";
echo $str1, $str2;
For more information on how PHP treats interpolated strings and string concatenation, check out Sara Golemon's blog.

You are always going to create a new string when concatenating two or more strings together. This is not necessarily 'bad', but it can have performance implications in certain scenarios (like thousands/millions of concatenations in a tight loop). I am not a PHP guy, so I can't give you any advice on the semantics of the different ways of concatenating strings, but for a single string concatenation (or just a few), just make it readable. You are not going to see a performance hit from a low number of them.

Here's some quick and dirty test code to illustrate the performance differences.
Single concat:
$iterations = 1000000;
$table = 'FOO';
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = sprintf('DELETE FROM `%s` WHERE `ID` = ?', $table);
}
echo 'single sprintf,',(microtime(true) - $time)."\n";
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = 'DELETE FROM `' . $table . '` WHERE `ID` = ?';
}
echo 'single concat,',(microtime(true) - $time)."\n";
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = "DELETE FROM `$table` WHERE `ID` = ?";
}
echo 'single "$str",',(microtime(true) - $time)."\n";
I get these results:
single sprintf,0.66322994232178
single concat,0.18625092506409 <-- winner
single "$str",0.19963216781616
Many concats (10):
$iterations = 1000000;
$table = 'FOO';
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = sprintf('DELETE FROM `%s`,`%s`,`%s`,`%s`,`%s`,`%s`,`%s`,`%s`,`%s`,`%s` WHERE `ID` = ?', $table, $table, $table, $table, $table, $table, $table, $table, $table, $table);
}
echo 'many sprintf,',(microtime(true) - $time)."\n";
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = 'DELETE FROM `' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '`,`' . $table . '` WHERE `ID` = ?';
}
echo 'many concat,',(microtime(true) - $time)."\n";
$time = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $sql = "DELETE FROM `$table`,`$table`,`$table`,`$table`,`$table`,`$table`,`$table`,`$table`,`$table`,`$table` WHERE `ID` = ?";
}
echo 'many "$str",',(microtime(true) - $time)."\n";
Results:
many sprintf,2.0778489112854
many concat,1.535336971283
many "$str",1.0247709751129 <-- winner
In conclusion, it becomes obvious that a single concatenation via the dot (.) operator is the fastest. And for cases when you've got many concatenations, the best-performing method is direct variable interpolation, i.e. the "injection: $inject" syntax.

Unless it's a really large amount of text, it really doesn't matter.

As the others said, $str1 . $str2 is perfectly OK in most cases, except in (big) loops.
Note that you overlook some solutions:
$output = "$str1$str2";
and for a large number of strings, you can put them in an array and use implode() to get a single string out of them (see the sketch below).
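For instance, a minimal sketch of that array-plus-implode approach (the variable names and loop are purely illustrative):
$parts = array();
foreach (range(1, 1000) as $i) {
    $parts[] = "line $i";            // cheap append, no intermediate strings built
}
$output = implode("\n", $parts);     // single allocation for the final string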
Oh, and "adding strings" sounds bad, or at least ambiguous. In most languages, we prefer to speak of string concatenation.

It doesn't matter unless used in a looong loop. In usual cases, focus on code readability, even if you lose a few processor cycles.
Examples 1 and 2 are similar; I don't think there should be much difference, and these would be the fastest of all. No. 1 might be slightly faster.
Example 3 will be slower, as the sprintf format string ('%s %s') needs to be parsed.
Example 4 does a replace, which involves searching within a string - an additional thing to do that takes more time.
But firstly, is concatenating strings a performance problem? It's very unlikely; you should profile the code to measure how much time it takes to run. Then replace the concatenation method with a different one and time it again.
If you identify it as a problem, try googling for a PHP string builder class (there are some to be found) or write your own (see the sketch below).
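As a rough, hedged illustration (not any particular library's API), such a string builder usually just wraps the array-and-implode idea:
class StringBuilder {
    private $parts = array();

    public function append($str) {
        $this->parts[] = $str;   // appending to an array is cheap
        return $this;            // allow chaining
    }

    public function build($glue = '') {
        return implode($glue, $this->parts); // one final allocation
    }
}

$sb = new StringBuilder();
$sb->append('Hello ')->append('World!');
echo $sb->build(); // Hello World!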

Found this post from Google, and thought I'd run some benchmarks, as I was curious what the result would be. (Benchmarked over 10,000 iterations using a benchmarker that subtracts its own overhead.)
Which                     2 strings       10 strings    50 strings
------------------------------------------------------------------
$a[] then implode()       2728.20 ps      6.02 μs       22.73 μs
$a . $a . $a               496.44 ps      1.48 μs        7.00 μs
$b .= $a                   421.40 ps ★    1.26 μs        5.56 μs
ob_start() and echo $a    2278.16 ps      3.08 μs        8.07 μs
"$a$a$a"                   482.87 ps      1.21 μs ★      4.94 μs ★
sprintf()                 1543.26 ps      3.21 μs       12.08 μs
So there's not much in it. Probably good to avoid sprintf() and implode() if you need something to be screaming fast, but there's not much difference between all the usual methods.
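For reference, here is a rough sketch of what each row above might correspond to in code — this is my assumption, since the original benchmark harness isn't shown:
$a = 'chunk';
$b = '';

$s1 = implode('', array($a, $a, $a));   // $a[] then implode()
$s2 = $a . $a . $a;                     // $a . $a . $a
$b .= $a; $b .= $a; $b .= $a;           // $b .= $a
ob_start(); echo $a, $a, $a;            // ob_start() and echo $a
$s4 = ob_get_clean();
$s5 = "$a$a$a";                         // "$a$a$a"
$s6 = sprintf('%s%s%s', $a, $a, $a);    // sprintf()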

There are three types of string-joining operations.
Concatenate: take two strings, allocate memory of size length1+length2, and copy each into the new memory. Quickest for two strings. However, concatenating 10 strings then requires 9 concat operations. The memory used is the 1st string 10 times, the 2nd string 10 times, the 3rd string 9 times, the 4th string 8 times, etc. It runs X+1 + (X-1)*2 operations, using more memory each cycle.
sprintf (similarly implode, join, etc.): take all the strings together, sum their lengths, allocate a new string of the summed size, then copy each string into its respective place. Memory used is 2x the length of all initial strings, and the number of operations is 2X (one length, one copy per string).
ob (output buffering): allocates a generic 4k chunk and copies each string into it. Memory is 4k + each initial string; operations = 2 + X (start, end, each copy).
Pick your poison.
OB is like using a memory atom bomb to join two small strings, but it is very effective when there are many joins, loops, or conditions, or when the additions are too dynamic for a clean sprintf.
Concat is the most efficient for joining a few fixed strings;
sprintf works better for building a string out of fixed values in one go.
I don't know which routine PHP uses for "$x $y $z"; it might just be reduced to an inline $x . " " . $y . " " . $z
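To make the three styles concrete, here is a small illustrative sketch of the same join done each way (the data is made up):
$parts = array('SELECT ', '*', ' FROM ', 'users');

// 1. Concatenation: each "." builds a new intermediate string
$sql = $parts[0] . $parts[1] . $parts[2] . $parts[3];

// 2. implode/sprintf style: lengths are summed once, everything is copied once
$sql = implode('', $parts);

// 3. Output buffering: echo into a buffer, then read it back in one go
ob_start();
foreach ($parts as $p) {
    echo $p;
}
$sql = ob_get_clean();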

The advice you have read may have been related to echo, for which it's quicker to use commas, e.g.:
echo $str1, $str2;
Another approach is to build up a string in a variable (e.g. using the . operator), then echo the whole string at the end.
You could test this yourself using the microtime function (you'll need to make a loop that repeats e.g. 1,000 or 100,000 times to make the numbers significant). But of the four you posted, the first one is likely to be the fastest. It's also the most readable - the others don't really make sense programmatically.
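For instance, a minimal microtime harness along those lines might look like this (illustrative only; absolute numbers will vary by machine and PHP version):
$str1 = 'Hello ';
$str2 = 'World!';
$iterations = 100000;

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $out = $str1 . $str2;            // the method under test
}
$elapsed = microtime(true) - $start;

echo "concatenation: {$elapsed}s\n";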

This is not a solution for two strings, but when you are thinking of joining many strings, the best way is like this:
$tmp = array();
for ($i = 0; $i < 100; $i++) {
    $tmp[] = 'some string';   // collect the pieces in a loop
}
$str = implode('', $tmp);
It's faster to create array elements and join them all at once than to join them a hundred times.

I'm not a PHP guru, however, in many other languages (e.g. Python), the fastest way to build a long string out of many smaller strings is to append the strings you want to concatenate to a list, and then to join them using a built-in join method. For example:
$result = array();
array_push($result, "Hello,");
array_push($result, "my");
array_push($result, "name");
array_push($result, "is");
array_push($result, "John");
array_push($result, "Doe.");
$my_string = join(" ", $result);
If you are building a huge string in a tight loop, the fastest way to do it is by appending to the array and then joining the array at the end.
Note: This entire discussion hinges on the performance of array_push. You need to be appending your strings to a list in order for this to be effective on very large strings. Because of my limited exposure to PHP, I'm not sure if such a structure is available or whether PHP's array is fast at appending new elements.
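For what it's worth, PHP arrays do support cheap appends, and the idiomatic form is the [] operator rather than array_push; a small sketch of the same example (my addition, not the original answerer's code):
$parts = array();
foreach (array("Hello,", "my", "name", "is", "John", "Doe.") as $word) {
    $parts[] = $word;              // amortized O(1) append
}
$my_string = implode(' ', $parts); // join() is an alias of implode()
echo $my_string;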

Nearly two years after the last post in this thread: I think the solution below may be the fastest for a huge number of iterations in tight loops:
ob_start();
echo $str1;
echo $str2;
// ...
echo $str_n;
$finalstr = ob_get_clean();
This method ensures flat storage of all strings with no processing or concatenation overhead. With the final line, you get the entire buffer back as well. You can safely run loops instead of independent echo statements.

Related

Is there any difference in performance between assigning a long string to a variable and subsequently appending smaller strings to a variable?

As the title says, is there a difference in performance between:
$var = "Very long string to add";
and
$var = "Appending";
$var.= "subsequentially";
$var.= "short";
$var.= "strings";
I like to use the second method because I can keep the code cleaner, but I'm worried multiple assignments could affect performance. Is it something worth worrying about?
Is there a way to test it maybe?
Yes, there is a way to measure it ... see the microtime function ... but it's hard to say how precise this would be.
Test example is below.
<?php
$s1 = microtime(true);
$var1 = "Very long string to add";
$e1 = microtime(true);
$s2 = microtime(true);
$var2 = "Appending";
$var2.= "subsequentially";
$var2.= "short";
$var2.= "strings";
$e2 = microtime(true);
echo 'Big string: ' . number_format($e1 - $s1, 9, '.', '') . 's';
echo PHP_EOL;
echo 'Appended: ' . number_format($e2 - $s2, 9, '.', '') . 's';
Tests looked mostly like this:
Big string: 0.000001907s
Appended: 0.000002146s
Big string: 0.000000954s
Appended: 0.000003099s
Big string: 0.000000954s
Appended: 0.000002146s
Both solutions take at most 1 or 2 microseconds, so ... yes, the appending technique is a little more costly, but not so significantly that you should care :) ...
... unless you count every microsecond, in which case I would recommend a language other than PHP.

PHP random string function preventing duplicate outputs when called multiple times

The following function will create a random string as output. What is the best way to prevent duplicate outputs / collisions if it is called multiple times?
function random_string($length) {
    $key = '';
    $keys = array_merge(range('A', 'Z'), range('a', 'z'), array('_'));
    for ($i = 0; $i < $length; $i++) {
        $key .= $keys[array_rand($keys)];
    }
    return $key;
}
echo "First : " . random_string(rand(3, 50));
//These have a small chance of matching the previous random output; let's eliminate all possibility of getting the same output.
echo "Second : " . random_string(rand(3, 50));
echo "Third : " . random_string(rand(3, 50));
echo "Fourth : " . random_string(rand(3, 50));
I did read in the PHP documentation that array_unique could achieve what I want, but would it be the best solution, or is there a more efficient way?
Here's a simple solution:
// array to store required strings
$stringsCollection = [];
// 4 is the number of strings that you need
while (sizeof($stringsCollection) != 4) {
    // generate new string
    $randString = random_string($length);
    // if string is not in `stringsCollection` - add it as a key
    if (!isset($stringsCollection[$randString])) {
        $stringsCollection[$randString] = 1;
    }
}
print_r(array_keys($stringsCollection));
or is there a more efficient way
You are heading to "overengineering land", trying to fix things you cannot even name :) Is it slow? Sluggish? Then profile and fix the relevant parts. "Efficient" is a buzzword without defining what efficiency means for you.
what is the best way to prevent duplicate outputs / collisions if called multiple times
First, I'd use a hashing function with a detailed (i.e. seconds or millis) timestamp + some mt_rand as input instead. It's limited in length, but you can call it multiple times and concatenate the results (+ trim if needed). And if you want to be 1000% sure a value was never returned before, you must keep track of them; however, you can most likely assume this is not going to happen if your input string is long enough.
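A minimal sketch of that hashing idea (my interpretation of the suggestion, with illustrative names):
function pseudo_unique_string($length) {
    $hash = '';
    // keep hashing a fresh timestamp + random input until we have enough characters
    while (strlen($hash) < $length) {
        $hash .= sha1(microtime(true) . mt_rand());
    }
    return substr($hash, 0, $length); // trim to the requested length
}

echo pseudo_unique_string(40);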

PHP string direct access using str[index] vs splitting into an array

I'm iterating through each character in a string in PHP.
Currently I'm using direct access
$len = strlen($str);
$i = 0;
while ($i++ < $len) {
    $char = $str[$i];
    // ....
}
That got me pondering what is probably purely academic.
How does direct access work under the hood, and is there a length of string that would see optimization in a character loop (micro though it may be) by splitting said string into an array and using the array's internal pointer to keep the index location in memory?
TLDNR:
Would accessing each member of a 5 million item array be faster than accessing each character of a 5 million character string directly?
Accessing a string's bytes is faster by an order of magnitude. Why? PHP likely just maps each array index to the offset where it stores each byte in memory. So it likely goes right to the location it needs, reads in one byte of data, and is done. Note that unless the characters are single-byte, you will not actually get a usable character from accessing the string as a byte array.
When accessing a potentially multi-byte string (via mb_substr), a number of additional steps need to be taken to check whether the character is more than one byte, determine how many bytes it is, then access each needed byte and return the individual [possibly multi-byte] character (notice there are a few extra steps).
So, I put together some simple test code just to show that array-byte access is orders of magnitude faster (but will not give you a usable character if a multi-byte character exists at the given byte index). I grabbed the random character function from here ( Optimal function to create a random UTF-8 string in PHP? (letter characters only) ), then added the following:
$str = rand_str(5000000, 5000000);
$bStr = unpack('C*', $str);
$len = count($bStr) - 1;
$i = 0;
$startTime = microtime(true);
while ($i++ < $len) {
    $char = $str[$i];
}
$endTime = microtime(true);
echo '<pre>Array access: ' . $len . ' items: ', $endTime - $startTime, ' seconds</pre>';

$i = 0;
$len = mb_strlen($str) - 1;
$startTime = microtime(true);
while ($i++ < $len) {
    $char = mb_substr($str, $i, 1);
    if ($i >= 100000) {
        break;
    }
}
$endTime = microtime(true);
echo '<pre>Substring access: ' . ($len + 1) . ' (limited to ' . $i . ') items: ', $endTime - $startTime, ' seconds</pre>';
You will notice that I have restricted the mb_substr loop to 100,000 characters. Why? It just takes too darn long to run through all 5,000,000 characters!
What were my results?
Array access: 12670380 items: 0.4850001335144 seconds
Substring access: 5000000 (limited to 100000) items: 17.00200009346 seconds
Notice the string array access was able to run through all 12,670,380 bytes -- yep, 12.6 MILLION bytes from 5 MILLION characters [many were multi-byte] -- in just half a second, while the mb_substr loop, limited to 100,000 characters, took 17 seconds!
The answer to your question is that your current method is highly likely the fastest way.
Why?
Since a string in PHP is just an array of bytes (one byte per character for single-byte encodings; multi-byte characters span several bytes), there shouldn't be a theoretically faster form of array.
Moreover, any additional implementation of an array to which you'd copy the characters of your original string would add overhead and slow things down.
If your string is highly limited in its contents (for instance, only allowing 16 characters instead of 256), there may be faster implementations, but that seems like an edge case.
Quick answer (for non-multibyte strings, which may have been what the OP was asking about, and useful to others as well): direct access is still faster (by about a factor of 2). Here's the code, based on the accepted answer, but doing an apples-to-apples comparison using substr() rather than mb_substr():
$str = base64_encode(random_bytes(4000000));
$len = strlen($str) - 1;
$i = 0;
$startTime = microtime(true);
while ($i++ < $len) {
    $char = $str[$i];
}
$endTime = microtime(true);
echo '<pre>Array access: ' . $len . ' items: ', $endTime - $startTime, ' seconds</pre>';

$i = 0;
$len = strlen($str) - 1;
$startTime = microtime(true);
while ($i++ < $len) {
    $char = substr($str, $i, 1);
}
$endTime = microtime(true);
echo '<pre>Substring access: ' . ($len) . ' items: ', $endTime - $startTime, ' seconds</pre>';
Note: I used base64 encoding of random bytes to create the random string, as rand_str was not a defined function. Maybe not exactly the most random, but certainly random enough for testing.
My results:
Array access: 5333335 items: 0.40552091598511 seconds
Substring access: 5333335 items: 0.87574410438538 seconds
Note: I also tried $chars = preg_split('//', $str, -1, PREG_SPLIT_NO_EMPTY); and iterating through $chars. Not only was this slower, but it ran out of memory with a 5,000,000-character string.

Performance of variable expansion vs. sprintf in PHP

Regarding performance, is there any difference between doing:
$message = "The request $request has $n errors";
and
$message = sprintf('The request %s has %d errors', $request, $n);
in PHP?
I would say that calling a function involves more overhead, but I do not know what PHP does behind the scenes to expand variable names.
Thanks!
It does not matter.
Any performance gain would be so minuscule that you would see it (as an improvement in the hundredths of seconds) only with 10,000s or 100,000s of iterations - if even then.
For specific numbers, see this benchmark. You can see it has to generate 1MB+ of data using 100,000 function calls to achieve a measurable difference in the hundreds of milliseconds. Hardly a real-life situation. Even the slowest method ("sprintf() with positional params") takes only 0.00456 milliseconds vs. 0.00282 milliseconds with the fastest. For any operation requiring 100,000 string output calls, you will have other factors (network traffic, for example) that will be an order of magnitude slower than the 100ms you may be able to save by optimizing this.
Use whatever makes your code most readable and maintainable for you and others. To me personally, the sprintf() method is a neat idea - I have to think about starting to use that myself.
In all cases the second won't be faster when you supply a double-quoted format string, since it has to be parsed for variables as well. If you are going for micro-optimization, the proper way is:
$message = sprintf('The request %s has %d errors', $request, $n);
Still, I believe the second is slower (as #Pekka pointed out, the difference actually does not matter) because of the overhead of a function call, parsing the format string, converting values, etc. But please note that the two lines of code are not equivalent, since in the second case $n is converted to an integer. If $n is "no error", then the first line will output:
The request $request has no error errors
While the second one will output:
The request $request has 0 errors
A performance analysis about "variable expansion vs. sprintf" was made here.
As #pekka says, use whatever "makes your code most readable and maintainable for you and others". When the performance gain is "low" (roughly less than a factor of two), ignore it.
Summarizing the benchmark: PHP is optimized for double-quoted and heredoc resolution. Percentages relative to the average time for building a very long string using only:
double-quoted resolution: 75%
heredoc resolution: 82%
single-quote concatenation: 93%
sprintf formatting: 117%
sprintf formatting with indexed params: 133%
Note that only sprintf does any formatting work (see the benchmark's '%s%s%d%s%f%s'), and as #Darhazer shows, it makes some difference in the output. A better test is two benchmarks: one only comparing concatenation times (the '%s' formatter), the other including the formatting process — for example '%3d%2.2f' and the functional equivalents applied before expanding variables into double quotes... And one more benchmark combination using short template strings.
PROS and CONS
The main advantage of sprintf is, as shown by the benchmarks, its very low-cost formatting (!). For generic templating I suggest using the vsprintf function (see the sketch after this list).
The main advantages of double-quoted strings (and heredoc) are some performance, plus the readability and maintainability of named placeholders, which grows with the number of parameters (beyond one) when compared with the positional marks of sprintf.
The use of indexed placeholders is halfway there in maintainability with sprintf.
NOTE: do not use single-quote concatenation unless really necessary. Remember that PHP enables safe syntax like "Hello {$user}_my_brother!" and references like "Hello {$this->name}!".
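For example, a minimal vsprintf sketch for generic templating (the template and values are made up for illustration):
$template = 'The request %s has %d errors';
$values   = array('GET /index', 3);

echo vsprintf($template, $values); // The request GET /index has 3 errors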
I am surprised, but for PHP 7.* "$variables replacement" is the fastest approach:
$message = "The request {$request} has {$n} errors";
You can simply prove it yourself:
$request = "XYZ";
$n = "0";
$mtime = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
$message = "The request {$request} has {$n} errors";
}
$ctime = microtime(true);
echo '
"variable $replacement timing": '. ($ctime-$mtime);
$request = "XYZ";
$n = "0";
$mtime = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
$message = 'The request '.$request.' has '.$n.' errors';
}
$ctime = microtime(true);
echo '
"concatenation" . $timing: '. ($ctime-$mtime);
$request = "XYZ";
$n = "0";
$mtime = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
$message = sprintf('The request %s has %d errors', $request, $n);
}
$ctime = microtime(true);
echo '
sprintf("%s", $timing): '. ($ctime-$mtime);
The result for PHP 7.3.5:
"variable $replacement timing": 0.091434955596924
"concatenation" . $timing: 0.11175799369812
sprintf("%s", $timing): 0.17482495307922
You have probably already found recommendations like 'use sprintf instead of variables contained in double quotes, it's about 10x faster' (for example in "What are some good PHP performance tips?").
I see it was true, but only once upon a time: namely before PHP 5.2.*.
Here is a sample of how it was in those days (PHP 5.1.6):
"variable $replacement timing": 0.67681694030762
"concatenation" . $timing: 0.24738907814026
sprintf("%s", $timing): 0.61580610275269
For injecting multiple string variables into a string, the first one will be faster:
$message = "The request $request has $n errors";
And for a single injection, dot (.) concatenation will be faster:
$message = 'The request '.$request.' has 0 errors';
Run the iteration in a very large loop (e.g. a billion times) and measure the difference.
For example:
<?php
$request = "XYZ";
$n = "0";
$mtime = microtime(true);
for ($i = 0; $i < 1000000; $i++) {
    $message = "The request {$request} has {$n} errors";
}
$ctime = microtime(true);
echo ($ctime-$mtime);
?>
Ultimately the first is the fastest in the context of a single variable assignment, as can be seen by looking at various benchmarks. Perhaps, though, using the sprintf flavor of core PHP functions could allow for more extensible code and be better suited to bytecode-level caching mechanisms like OPcache or APC. In other words, an application of a given size could use less code when utilizing the sprintf method. The less code you have to cache into RAM, the more RAM you have for other things or more scripts. However, this only matters if your scripts wouldn't otherwise fit into RAM.

What is the best way to add two strings together?

