PHP echo performance

I always use an output variable in PHP where I gather all the content before I echo it. Then I read somewhere (I don't remember where, though) that you get the best performance if you split the output variable into packets and echo each packet instead of the whole output variable.
How is it really?

If you are outputting really big strings with echo, it is better to use multiple echo statements.
This is because of the way Nagle's algorithm causes data to be buffered over TCP/IP.
Found a note about it on PHP's bug tracker:
http://bugs.php.net/bug.php?id=18029

This will automatically break up big strings into smaller chunks and echo them out:
function echobig($string, $bufferSize = 8192) {
    $splitString = str_split($string, $bufferSize);
    foreach ($splitString as $chunk) {
        echo $chunk;
    }
}
Source: http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why
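A quick usage sketch ($html is a hypothetical variable holding the fully built page, not something from the original post):
echobig($html); // the page goes out in 8 KB chunks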

I think a better solution is presented in this comment on the same post:
http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why#comment-5606
Guys, I think I narrowed it down even further!
As previously said, PHP buffering will let PHP race to the end of your script, but after that it will still “hang” while trying to pass all that data to Apache.
Now I was able not only to measure this (see previous comment) but to actually eliminate the waiting period inside of PHP. I did that by increasing Apache’s SendBuffer with the SendBufferSize directive.
This pushes the data out of PHP faster. I guess the next step would be to get it out of Apache faster but I’m not sure if there is actually another configurable layer between Apache and the raw network bandwidth.
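For reference, the directive mentioned in that comment goes into the Apache configuration; the value below is purely illustrative, not the one the commenter used:
# httpd.conf - raise the TCP send buffer so PHP can hand its output to Apache faster
SendBufferSize 1048576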

This is my version of the solution: it echoes only while the connection is not aborted. If the user disconnects, the function exits.
<?php
function _echo() {
    $args = func_get_args();
    foreach ($args as $arg) {
        // Send each argument in 8 KB chunks
        $strs = str_split($arg, 8192);
        foreach ($strs as $str) {
            // Stop entirely if the client has disconnected
            if (connection_aborted()) {
                break 2;
            }
            echo $str;
        }
    }
}
_echo('this','and that','and more','and even more');
_echo('or just a big long text here, make it as long as you want');


Echoing large string in PHP results in no output at all

I am helping to build a Joomla site (using Joomla 1.5.26). One of the pages is really, really big. As a result, PHP just stops working without any error, and all previously printed strings are ignored. There is no output at all. We have display_errors set to TRUE and error_reporting set to E_ALL.
I found the exact line where PHP breaks. It's in libraries/joomla/application/component/view.php:196
function display($tpl = null)
{
    $result = $this->loadTemplate($tpl);
    if (JError::isError($result)) {
        return $result;
    }
    echo $result;
}
Some information:
Replacing echo $result; with echo strlen($result); works. The length of the string is 257759.
echo substr($result, 0, 103396); is printing partial content.
echo substr($result, 0, 103397); results in no output at all.
echo substr($result, 0, 103396) . "A"; results in no output at all. So splitting string into chunks is not a solution.
I have checked server performance during the execution of the script. CPU usage is 100% but there's plenty of memory left. The PHP memory limit is 1024M. output_buffering is 4096, but I tried setting it to an unreasonably high number - it dies at the exact same position. The server runs Apache 2.2.14-5ubuntu8.10 and PHP 5.3.2-1ubuntu4.18. PHP runs as a fast_cgi module.
I have never experienced anything like this, and a Google search turns up nothing either. Have any of you experienced something like this and know the solution?
Thanks for reading!
Maybe try exploding the string and looping through each line.
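A minimal sketch of that idea, splitting $result on newlines and echoing line by line:
foreach (explode("\n", $result) as $line) {
    echo $line, "\n";
}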
You could also try this, found on php.net - echo:
<?php
function echobig($string, $bufferSize = 8192)
{
    // suggest doing a test for an integer & positive $bufferSize
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}
?>
Basically, it seems echo can't handle such large data in one call. Breaking it up somehow should get you where you need to go.
What about trying print_r rather than echo?
function display($tpl = null)
{
    $result = $this->loadTemplate($tpl);
    if (JError::isError($result)) {
        return $result;
    }
    print_r($result);
}
I have tested this on the CLI and it works fine with PHP 5.4.11 and 5.3.15:
$str = '';
for ($i = 0; $i < 257759; $i++) {
    $str .= 'a';
}
echo $str;
It seems a reasonable assumption that PHP itself works fine, but that the output buffer is too large for Apache/fast_cgi. I would investigate the Apache config further. Do you have any special Apache settings?
Maybe it's this? Try something like:
php_flag output_buffering On
Or try turning on gzip in Joomla!
Or use nginx as a reverse proxy or as a standalone server :^)
It seems I solved the problem by myself. It was somewhat unexpected - faulty HTML formatting. We use a template for the order page, and inside it there is a loop which shows all the ordered products. When there were only a few products everything worked great, but when I tried the same with 40 products, the page broke.
However I still don't understand why the server response would be empty with status code 200.
Thanks for the answers, everybody!

How to overwrite PHP memory for security reasons?

I am working on a security script, and it seems that I have hit a problem with PHP and the way PHP uses memory.
my.php:
<?php
// Display current PID
echo 'pid= ', posix_getpid(), PHP_EOL;
// The user type a very secret key
echo 'Fill secret: ';
$my_secret_key = trim(fgets(STDIN));
// 'Destroy' the secret key
unset($my_secret_key);
// Wait for something
echo 'waiting...';
sleep(60);
And now I run the script:
php my.php
pid= 1402
Fill secret: AZERTY <= User input
waiting...
Before the script ends (while it is sleeping), I generate a core file by sending a SIGSEGV signal to the process
kill -11 1402
I inspect the corefile:
strings core | less
Here is an extract of the result:
...
fjssdd
sleep
STDIN
AZERTY <==== this is the secret key
zergdf
...
I understand that the memory is just released by unset() and not 'destroyed'. The data is not really removed (it is just a call to the free() function).
So if someone dumps the memory of the process, even after the script execution, he could read $my_secret_key (until that memory space is overwritten by another process).
Is there a way to overwrite this memory segment, or the full memory space, after the PHP script finishes?
Thanks to all for your comments.
I already know how memory is managed by the system.
Even though PHP doesn't use malloc and free directly (but its own wrappers like emalloc and efree), it seems (and I understand why) that it is simply impossible for PHP to 'trash' memory after it has been freed.
The question was more out of curiosity, and every comment seems to confirm what I previously intended to do: write a little piece of code in a memory-aware language (C?) to handle this special part by allocating a simple string with malloc, overwriting it with XXXXXX after use, THEN freeing it.
Thanks to all
J
You seem to be lacking a lot of understanding about how memory management works in general, and specifically within PHP.
A discussion of the various salient points is redundant when you consider what the security risk is here:
So if someone dumps the memory of the process, even after the script execution
If someone can access the memory of a program running under a different uid then they have root access and can compromise the target in so many other ways - and it doesn't matter if it's PHP script, ssh, an Oracle DBMS....
If someone can access the memory previously occupied by a process which has now terminated, then not only have they got root, they've already compromised the kernel.
You seem to have missed an important lesson in what computers mean by "delete operations".
See, it's never feasible for a computer to zero out memory; instead it just "forgets" it was using that memory.
In other words, if you want to clear memory, you most definitely need to overwrite it, just as #hakre suggested.
That said, I hardly see the point of your script. PHP just isn't made for the sort of thing you are doing. You're probably better off with a small dedicated solution rather than using PHP. But this is just my opinion. I think.
I dunno if that works, but if you can, please add these lines in your tests and see the outcome:
...
// Overwrite it:
echo 'Overwrite secret: ';
for ($l = strlen($my_secret_key), $i = 0; $i < $l; $i++) {
    $my_secret_key[$i] = '#';
}
And I wonder whether or not running
gc_collect_cycles();
makes a difference. Even if the values are freed, they might still be in memory (of the script's PID or even somewhere else in the memory space).
I would test whether overwriting memory with some data would eventually erase the original locations of your variables:
$buffer = '';
for ($i = 0; $i < 1e6; $i++) {
    $buffer .= "\x00";
}
As soon as PHP releases the memory, I suppose new allocations might be given the same location. It's hardly fail-proof, though.
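Putting the two suggestions above together, a minimal sketch might look like the following; whether the engine actually reuses and overwrites the freed block is an assumption, not a guarantee:
<?php
$my_secret_key = trim(fgets(STDIN));
// ... use the secret ...

// Overwrite the string in place before releasing it
for ($l = strlen($my_secret_key), $i = 0; $i < $l; $i++) {
    $my_secret_key[$i] = '#';
}
unset($my_secret_key);
gc_collect_cycles();

// Churn the allocator so the freed block is more likely to be handed out again
$buffer = str_repeat("\x00", 1000000);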

PHP Loop - expression/function causing serious delay

I was wondering if anybody could shed any light on this problem.. PHP 5.3.0 :)
I have a loop which grabs the contents of a CSV file (large, 200 MB), handles the data, and builds a stack of variables for MySQL inserts; once the loop is complete and the variables are created, I insert the information.
Now, firstly, the MySQL insert performs perfectly, no delays and all is fine. However, it's the LOOP itself that has the delay. I was originally using fgetcsv() to read the CSV file, but compared to file_get_contents() this had a serious delay, so I switched to file_get_contents(). The loop performs in a matter of seconds, until I attempt to add a function (I've also tried the expression inside the loop without the function, to see if it helps) that creates an array from the CSV data on each line; this is what is causing serious delays in the parsing time! (The difference is about 30 seconds with this 200 MB file, but I guess it depends on the size of the CSV file.)
Here's some code so you can see what I'm doing:
$filename = "file.csv";
$content = file_get_contents($filename);
$rows = explode("\n", $content);
foreach ($rows as $data) {
    $data = preg_replace("/^\"(.*)\"$/", "$1", preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($data))); // THIS IS THE CULPRIT CAUSING SLOW LOADING?!?
}
Running the above loop, will perform almost instantly without the line:
$data = preg_replace("/^\"(.*)\"$/","$1",preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($data)));
I've also tried creating a function as below (outside of the loop):
function csv_string_to_array($str) {
    $expr = "/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/";
    $results = preg_split($expr, trim($str));
    return preg_replace("/^\"(.*)\"$/", "$1", $results);
}
and calling the function instead of the one liner:
$data = csv_string_to_array($data);
With again no luck :(
Any help would be appreciated on this. I'm guessing the fgetcsv function performs in a very similar way, based on the delay it causes: looping through and creating an array from each line of data.
Danny
The regex subexpressions (bounded by "(...)") are the issue. It's trivial to show that adding these to an expression can greatly reduce its performance. The first thing I would try is to stop using preg_replace() to simply remove leading and trailing double quotes (trim() would be a better bet for that) and see how much that helps. After that you might need to try a non-regex way to parse the line.
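A hedged sketch of that first step: keep the preg_split(), but strip the surrounding quotes with trim()'s character-list argument instead of preg_replace():
$fields = preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($data));
// Strip leading/trailing double quotes without a second regex pass
$fields = array_map(function ($field) {
    return trim($field, '"');
}, $fields);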
I partially found a solution: I'm sending a batch to loop over only 1000 lines at a time (PHP loops in batches of 1000 until it reaches the end of the file).
I'm then only setting:
$data = preg_replace("/^\"(.*)\"$/","$1",preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($data)));
on those 1000 lines, so that it's not being applied to the WHOLE file, which was causing the issues.
It now loops and inserts 1000 rows into the MySQL database in 1-2 seconds, which I'm happy with. I've set up the script to loop over 1000 rows, remember its last location, then loop over the next 1000 until it reaches the end, and it seems to be working OK!
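The answer above doesn't show the batching code; a minimal sketch of the remember-the-last-location idea (the offset file name and the batch size of 1000 are assumptions) could look like this:
<?php
$offsetFile = 'import.offset'; // hypothetical file storing the last byte position
$offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$fp = fopen('file.csv', 'r');
fseek($fp, $offset);

$batch = 0;
while ($batch < 1000 && ($line = fgets($fp)) !== false) {
    $data = preg_replace("/^\"(.*)\"$/", "$1", preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($line)));
    // ... build the MySQL insert from $data ...
    $batch++;
}

file_put_contents($offsetFile, ftell($fp)); // remember where to resume on the next run
fclose($fp);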
I'd say the major culprit is the complexity of the preg_split() regexp.
And the explode() is probably eating some seconds.
$content = file_get_contents($filename);
$rows = explode("\n", $content);
could be replaced by:
$rows = file($filename); // returns an array
But, I second the above suggestion from ITroubs, fgetcsv() would probably be a much better solution.
I would suggest using fgetcsv for parsing the data. It seems like memory may be your biggest bottleneck. So, to avoid consuming 200 MB of RAM, you should parse line by line as follows:
$fp = fopen($input, 'r');
while (($row = fgetcsv($fp, 0, ',', '"')) !== false) {
    $out = '"' . implode('", "', $row) . '"'; // quoted, comma-delimited output
    // perform work
}
Alternatively: using conditionals in preg is typically very expensive. It can sometimes be faster to process these lines using explode() and trim() with its $charlist parameter.
The other alternative, if you still want to use preg, is to add the S modifier to try to speed up the expression.
http://www.php.net/manual/en/reference.pcre.pattern.modifiers.php
S
When a pattern is going to be used several times, it is worth spending more time analyzing it in order to speed up the time taken for matching. If this modifier is set, then this extra analysis is performed. At present, studying a pattern is useful only for non-anchored patterns that do not have a single fixed starting character.
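Applied to the pattern from the question, that just means appending S to the modifiers:
$expr = "/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/S"; // same pattern, with the "study" flag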
By the way, I don't think your function is doing what you think it should: it won't actually modify the $rows array when you've exited from the loop. To do that, you need something more like:
foreach ($rows as $key => $data) {
    $rows[$key] = preg_replace("/^\"(.*)\"$/", "$1", preg_split("/,(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))/", trim($data)));
}

Blocking file read in PHP?

I want to read everything from a text file and echo it. But more lines might be written to the text file while I'm reading, so I don't want the script to exit when it has reached the end of the file; instead, I want it to wait forever for more lines. Is this possible in PHP?
This is just a guess, but try passing through (passthru) the output of "tail -f".
But you will need to find a way to flush() your buffer.
IMHO a much nicer solution would be to build an AJAX page.
Read the contents of the file into an array, store the number of lines in the session, and print the content of the file.
Then start an AJAX request every x seconds to a script which checks the file; if the line count is greater than the session count, append the new lines to the page (a sketch of such a script follows below).
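A minimal sketch of such a check script; check.php and the session key name are hypothetical:
<?php
// check.php - polling endpoint: returns only the lines the client hasn't seen yet
session_start();

$lines = file('/where/ever/your/file/is', FILE_IGNORE_NEW_LINES);
$seen = isset($_SESSION['line_count']) ? $_SESSION['line_count'] : 0;

if (count($lines) > $seen) {
    echo implode("\n", array_slice($lines, $seen));
    $_SESSION['line_count'] = count($lines);
}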
You could use popen() instead:
$f = popen("tail -f /where/ever/your/file/is 2>&1", 'r');
while (!feof($f)) {
    $buffer = fgets($f);
    echo "$buffer\n";
    flush();
    sleep(1);
}
pclose($f);
The sleep is important; without it you will have 100% CPU usage.
In fact, when you "echo" it, it goes to the buffer. So what you want is "appending" the new content if it's added while the browser is still receiving output. And this is not possible (but there are some approaches to this).
I solved it.
The trick was to use fopen, and when EOF is reached, move the cursor back to the previous position and continue reading from there.
<?php
$handle = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle); // remember where we stopped
    } else {
        // EOF reached: rewind to the last known position and try again
        fseek($handle, $lastpos);
    }
}
?>
It still consumes quite a lot of CPU though; I don't know how to solve that.
You may also use filemtime: you get the latest modification timestamp, send the output, and at the end compare the stored filemtime with the current one again.
Anyway, if you want the script to keep pace with the browser (or client), you should send the output in chunks (fread, flush), then check for any changes at the end. If there are any changes, re-open the file and read from the latest position (you can get the position outside of the while(!feof()) loop).
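A hedged sketch of the filemtime() idea; the one-second polling interval is an assumption:
<?php
$file = 'text.txt';
$lastmod = filemtime($file);
$lastpos = 0;

while (true) {
    clearstatcache(); // filemtime()/filesize() results are cached, so reset the stat cache
    if (filemtime($file) !== $lastmod) {
        // The file changed: read everything added since the last known position
        $handle = fopen($file, 'r');
        fseek($handle, $lastpos);
        echo fread($handle, max(1, filesize($file) - $lastpos));
        flush();
        $lastpos = ftell($handle);
        fclose($handle);
        $lastmod = filemtime($file);
    }
    sleep(1); // assumed polling interval
}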

This loop won't echo my string

I have an infinite loop in which I want to sleep for a second on each iteration, but the browser just hangs and nothing is echoed. Can someone tell me why?
while (1)
{
    sleep(1);
    echo 'test' . '<br>';
}
It won't show anything until the server stops processing the script.
You could try flushing the output buffer with flush(), but it's not guaranteed to work.
Try this function from PHP.net:
function flush_buffers() {
    ob_end_flush();
    ob_flush();
    flush();
    ob_start();
}
That's because PHP sends data to the browser in big chunks. It is all about optimization: sending small amounts of data is bad for performance, so you want it sent in blocks big enough that the overhead of transferring them (both in speed and actual traffic) stays relatively low. A couple of small strings is just not big enough. Try adding a flush() right after the echo; it will force PHP to send the string to the browser.
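A minimal sketch of the loop with flushing added; the ob_get_level() guard is an extra assumption to avoid a notice when no output buffer is active, and whether the browser actually renders incrementally also depends on server-side buffering and compression:
while (1) {
    echo 'test' . '<br>';
    if (ob_get_level() > 0) {
        ob_flush(); // flush PHP's own output buffer, if one is active
    }
    flush(); // then ask the web server / SAPI to push the data to the client
    sleep(1);
}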
Because it's an infinite loop, it will never send the response or break.
