Does this PHP code crash Apache for anyone else?

preg_match_all('/(a)*/', str_repeat('a', 1000), $matches);
(edit: changed the regexp a bit to make it simpler while still crashing)
I ran it on PHP 5.3.5 with Apache 2.0 and it crashes the server. In the original version, changing the string length from 339 to 338 stopped the crash, so it seems like a bug to me. I tried reporting it at http://bugs.php.net/ but the site is down. Is this a PHP bug? Does it crash for anyone else?
Edit: Changing the code to
preg_match_all('/(?:a)*/', str_repeat('a', 339), $matches);
allows for a longer string before crashing.
If it doesn't crash for you, try increasing the string length by a factor of 10 or 100; it may be a memory issue and you may have more memory.
Edit 2: the crash is a complete process crash; on Windows 7 I get the "End task" message instantly after execution.
Edit 3: if the crash is due to too much backtracking, and the above example clearly can cause that, the following should not:
preg_match('/[^"\']*(;|$)/',
str_repeat('x', 1000), $matches);
This is my actual code that's crashing. It's simply meant to split multiple SQL queries on ;, while allowing ; inside single or double quotes. Why is this causing so much backtracking, and how can I fix it?

The problem isn't memory or execution time: it's PCRE backtracking blowing the process stack, which takes Apache down with it. Capping the backtrack limit makes the match give up cleanly (preg_match_all() returns false and preg_last_error() reports PREG_BACKTRACK_LIMIT_ERROR) instead of crashing:
ini_set('pcre.backtrack_limit', 10000);
Feel free to lower the 10000 further if necessary. For more information, see http://php.net/manual/en/pcre.configuration.php.
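As for the regex in Edit 3, a possessive quantifier may avoid the backtracking entirely; a minimal sketch using standard PCRE syntax (untested against the original crash):
// *+ never gives back characters once matched, so PCRE has nothing to backtrack into
preg_match('/[^"\']*+(;|$)/', str_repeat('x', 1000), $matches);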
If you feel like testing where it crashes:
<?php
ini_set('pcre.backtrack_limit', 10000);
for ($i = 1; $i < 65535; $i++)
{
    echo $i . PHP_EOL;
    preg_match_all('/(a)*/', str_repeat('a', $i), $matches);
}
?>

Related

Echoing large string in PHP results in no output at all

I am helping to build a Joomla site (using Joomla 1.5.26). One of the pages is really, really big. As a result, PHP just stops working without any error, and all previously printed strings are discarded. There is no output at all. We have display_errors set to TRUE and error_reporting set to E_ALL.
I found the exact line where PHP breaks. It's in libraries/joomla/application/component/view.php:196
function display($tpl = null)
{
    $result = $this->loadTemplate($tpl);
    if (JError::isError($result)) {
        return $result;
    }
    echo $result;
}
Some information:
Replacing echo $result; with echo strlen($result); works. The length of the string is 257759.
echo substr($result, 0, 103396); prints partial content.
echo substr($result, 0, 103397); results in no output at all.
echo substr($result, 0, 103396) . "A"; also results in no output, so splitting the string into chunks is not a solution.
I have checked server performance during execution of the script. CPU usage is 100%, but there's plenty of memory left. The PHP memory limit is 1024M. output_buffering is 4096, but I also tried setting it to an unreasonably high number; it dies at the exact same position. The server runs Apache 2.2.14-5ubuntu8.10 and PHP 5.3.2-1ubuntu4.18. PHP runs as a fast_cgi module.
I have never experienced anything like this, and a Google search turns up nothing either. Have any of you experienced something like this and do you know the solution?
Thanks for reading!
Maybe try exploding the string and looping through each line.
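A minimal sketch of that idea, assuming $result from the display() method above:
// Hypothetical: emit the big string line by line instead of in one echo
foreach (explode("\n", $result) as $line) {
    echo $line, "\n";
}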
You could also try this, found on php.net - echo:
<?php
function echobig($string, $bufferSize = 8192)
{
    // Suggestion: also check that $bufferSize is a positive integer
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}
?>
Basically, it seems echo can't handle that much data in one call. Breaking it up somehow should get you where you need to go.
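With the Joomla view above, that would presumably mean replacing echo $result; with:
echobig($result);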
What about trying print_r rather than echo?
function display($tpl = null)
{
    $result = $this->loadTemplate($tpl);
    if (JError::isError($result)) {
        return $result;
    }
    print_r($result);
}
I have tested this on the CLI and it works fine with PHP 5.4.11 and 5.3.15:
$str = '';
for ($i = 0; $i < 257759; $i++) {
    $str .= 'a';
}
echo $str;
It seems a reasonable assumption that PHP itself works fine, but that the output buffer is too large for Apache/fast_cgi. I would investigate the Apache config further. Do you have any special Apache settings?
Maybe it's the output buffering? Try something like this in your Apache config or .htaccess:
php_flag output_buffering On
Or try turning on gzip compression in Joomla!
Or use nginx as a reverse proxy or standalone server :^)
It seems I solved the problem by myself. It was a somewhat unexpected thing: faulty HTML formatting. We use a template for the order page, and inside there is a loop which shows all ordered products. When there were a few products everything worked great, but when I tried the same with 40 products, the page broke.
However, I still don't understand why the server response would be empty with status code 200.
Thanks for the answers, everybody!

PHP Output Buffer "Millenium" Bug?

I faced a strange issue today.
For several months I used buffer flushing in PHP to send small strings to the client without problems.
Today I returned to the project, and it turned out that my server won't send strings smaller than 512 bytes.
Here is my code:
<?php
echo "length:" . $myUpcomingStringSize;
ob_flush();
flush();
sleep(1);
for ($i = 0; $i < count($allLines); $++) {
    echo $allLines[$i];
    ob_flush();
    flush();
}
?>
This code worked like a charm all last year, and now it doesn't anymore. I played around a bit and added some random characters: once the string size is equal to or greater than 512 bytes, the server sends the buffer content.
Can anybody imagine what issue I have to solve here? Is anyone else facing this issue? Or does someone know how to configure this minimum packet size?
If you changed neither the program nor the server, you should assume that the program never worked as intended. Windows systems in particular are known to buffer output until a certain number of bytes is in the output buffer. This buffering happens at the system level and thus cannot be affected by any PHP configuration.
If you know that 512 bytes is the minimum required for the output buffer to be sent, then you could use something like
define('MIN_OUTPUT_LENGTH', 512);
echo str_pad("length: $myUpcomingStringSize", MIN_OUTPUT_LENGTH, "\0"), "\n";
// (If you run into trouble with the null bytes, use the space character instead)
Notes
If you do not use "userspace" output buffering, then ob_flush(); is redundant.
If there is no delay in your for loop, then flushing between lines is not a good idea, especially for mobile connections, where the network tries to pack as much data as possible into a single packet.
There is a syntax error in your for loop header: the expression $++ is missing a variable identifier, probably i.

How to overwrite PHP memory for security reasons?

I am working on a security script, and it seems that I've met a problem with PHP and the way PHP uses memory.
my.php:
<?php
// Display the current PID
echo 'pid= ', posix_getpid(), PHP_EOL;
// The user types a very secret key
echo 'Fill secret: ';
$my_secret_key = trim(fgets(STDIN));
// 'Destroy' the secret key
unset($my_secret_key);
// Wait for something
echo 'waiting...';
sleep(60);
And now I run the script:
php my.php
pid= 1402
Fill secret: AZERTY <= User input
waiting...
Before the script ends (while it is sleeping), I generate a core file by sending a SIGSEGV signal to the process:
kill -11 1402
I inspect the core file:
strings core | less
Here is an extract of the result:
...
fjssdd
sleep
STDIN
AZERTY <==== this is the secret key
zergdf
...
I understand that the memory is just released by unset and not 'destroyed': the data is not actually removed (it behaves like a call to the free() function, which does not zero the memory).
So if someone dumps the memory of the process, even after the script execution, they could read $my_secret_key (until that memory is overwritten by another process).
Is there a way to overwrite this memory segment, or the full memory space, after the PHP script finishes?
Thanks to all for your comments.
I already know how memory is managed by the system.
Even though PHP doesn't use malloc and free directly (but its own wrappers, emalloc and efree), it seems (and I understand why) that it is simply impossible for PHP to scrub freed memory it no longer owns.
The question was more out of curiosity, and all the comments seem to confirm what I previously intended to do: write a little piece of code in a memory-aware language (C?) to handle this special part, by allocating a simple string with malloc, overwriting it with XXXXXX after use, THEN freeing it.
Thanks to all
J
You seem to be lacking a lot of understanding about how memory management works in general, and specifically within PHP.
A discussion of the various salient points is redundant when you consider what the security risk is here:
So if someone dumps the memory of the process, even after the script execution
If someone can access the memory of a program running under a different uid then they have root access and can compromise the target in so many other ways - and it doesn't matter if it's PHP script, ssh, an Oracle DBMS....
If someone can access the memory previously occupied by a process which has now terminated, then not only have they got root, they've already compromised the kernel.
You seem to have missed an important lesson in what computers mean by "delete" operations.
See, it's generally not worth it for a computer to zero out memory; instead, it just "forgets" it was using that memory.
In other words, if you want to clear memory, you most definitely need to overwrite it, just as #hakre suggested.
That said, I hardly see the point of your script. PHP just isn't made for the sort of thing you are doing. You're probably better off with a small dedicated solution rather than using PHP. But this is just my opinion. I think.
I don't know if this works, but if you can, please add these lines to your tests and see the outcome:
...
// Overwrite it:
echo 'Overwrite secret: ';
for ($l = strlen($my_secret_key), $i = 0; $i < $l; $i++)
{
    $my_secret_key[$i] = '#';
}
And I wonder whether or not running
gc_collect_cycles();
makes a difference. Even when the values are freed, they might still be in memory (within the script's process, or even somewhere else in memory).
I would test whether overwriting memory with some data eventually erases the original locations of your variables:
$buffer = '';
for ($i = 0; $i < 1e6; $i++) {
    $buffer .= "\x00";
}
As soon as PHP releases the memory, I suppose subsequent allocations might be given the same location. It's hardly fail-proof, though.

memory_get_usage

I'm making a little benchmark class to display page load time and memory usage.
Load time is already working, but when I display the memory usage, it doesn't change.
Example:
$conns = array();
ob_start();
benchmark::start();
$conns[] = mysql_connect('localhost', 'root', '');
benchmark::stop();
ob_flush();
uses the same memory as
$conns = array();
ob_start();
benchmark::start();
for ($i = 0; $i < 1000; $i++)
{
    $conns[] = mysql_connect('localhost', 'root', '');
}
benchmark::stop();
ob_flush();
I'm using memory_get_usage(true) to get the memory usage in bytes.
memory_get_usage(true) will show the amount of memory allocated by the PHP engine, not the amount actually used by the script. It's very possible that your test script hasn't required the engine to ask for more memory.
For a test, grab a large(ish) file and read it into memory. You should see a change then.
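A minimal sketch of that test (the file name is a placeholder):
echo memory_get_usage(true), PHP_EOL;        // blocks requested so far
$big = file_get_contents('large_file.dat');  // placeholder: any large file
echo memory_get_usage(true), PHP_EOL;        // should jump after the read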
I've successfully used memory_get_usage(true) to track the memory usage of web-crawling scripts, and it's worked fine (since the goal was to slow things down before hitting the system memory limit). The one thing to remember is that it doesn't change based on actual usage; it changes based on the memory requested by the engine. So what you end up seeing are sudden jumps instead of slow growth (or shrinking).
If you set the real_usage flag to false, you may be able to see very small memory changes - however, this won't help you monitor the true amount of memory php is requesting from the system.
(Update: To be clear the difference I describe is between memory used by the variables of your script, compared to the memory the engine requested to run your script. All the same script, different way of measuring.)
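To make the distinction concrete, here is a small sketch comparing the two modes around a single allocation:
$before_real  = memory_get_usage(true);   // blocks the engine requested
$before_exact = memory_get_usage(false);  // what variables actually use

$data = str_repeat('x', 100000);          // allocate roughly 100 KB

echo 'real:  ', memory_get_usage(true) - $before_real, PHP_EOL;
echo 'exact: ', memory_get_usage(false) - $before_exact, PHP_EOL;
The exact figure should rise by roughly 100 KB, while the real figure may not move at all if the engine already had a large enough block on hand.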
I'm no guru in PHP's internals, but I could imagine that an echo does not affect the amount of memory used by PHP, as it just outputs something to the client.
It could be different if you enable output buffering.
The following should make a difference:
$result = null;
benchmark::start();
for ($i = 0; $i < 10000; $i++)
{
    $result .= 'test';
}
Look at:
for ($i = 0; $i < 1000; $i++)
{
    $conns[] = mysql_connect('localhost', 'root', '');
}
You could have looped to 100,000 and nothing would have changed; it's the same connection. No resources are allocated for it, because the linked list remembering them never grew. Why would it grow? There's already (presumably) a valid handle at $conns[0]. It won't make a difference in memory_get_usage(). You did test $conns[15] to see if it worked, yes?
Can root@localhost have multiple passwords? No. Why would PHP bother to open another connection just because you told it to? (tongue in cheek)
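For reference, mysql_connect() returns the already-open link when called again with the same host and credentials; passing true as the fourth ($new_link) argument forces a genuinely new connection each time, which should make the memory figures move:
for ($i = 0; $i < 1000; $i++)
{
    // fourth argument = $new_link: do not reuse the existing connection
    $conns[] = mysql_connect('localhost', 'root', '', true);
}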
I suggest running the same thing via CLI through Valgrind to see the actual heap usage:
valgrind /usr/bin/php -f foo.php .. or something similar. At the bottom, you'll see what was allocated, what was freed and garbage collection at work.
Disclaimer: I do know my way around PHP, but I am no expert in that deliberately obfuscated maze written in C that Zend calls PHP.
echo won't change the allocated number of bytes (unless you use output buffers).
And the $i variable is unset after the for loop, so it's not changing the number of allocated bytes either.
Try an output buffering example:
ob_start();
benchmark::start();
for ($i = 0; $i < 10000; $i++)
{
    echo 'test';
}
benchmark::stop();
ob_flush();

file_get_contents and file_put_contents with large files

I'm trying to get a file's contents, replace some parts of it using regular expressions and preg_replace, and save the result to another file:
$content = file_get_contents('file.txt', true);
$content_replaced = preg_replace('/\[\/m\]{1}\s+(\{\{.*\}\})\s+[\x{4e00}-\x{9fa5}]+/u', 'replaced text', $content);
if ($content_replaced) {
    file_put_contents('file_new.txt', $content_replaced);
    echo "Successful!";
}
else {
    echo "Some error occurred";
}
This piece of code works fine with small files, but when I try the original file, which is about 60 MB, it just keeps giving me the message "Some error occurred".
Any suggestions are greatly appreciated.
Update: no errors in the logs, and the memory limit is set to 1024M.
I've had max/limit issues with file_put_contents.
No idea what the limits might be, but using fwrite solved my troubles and I put down the bottle.
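A sketch of that fwrite() alternative, reusing the file names from the question:
// Write via an explicit file handle instead of file_put_contents()
$fh = fopen('file_new.txt', 'wb');
if ($fh !== false) {
    fwrite($fh, $content_replaced);
    fclose($fh);
}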
You're probably running out of memory. What's memory_limit set to? (phpinfo() will tell you.) You may be able to increase the memory limit like this:
ini_set('memory_limit','128M');
I'm pretty sure you're hitting some regex limit. Heck, some time ago I hit a limit with 1000 chars... with 60 MB of input I bet you'll hit regex limits everywhere, even with really simple patterns. I would at least try to simplify the pattern as much as possible, making it ungreedy with .*? instead of .* where possible.
To get more information, just check the return value of preg_last_error().
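A minimal sketch of that check (the constants come with the PCRE extension; preg_replace() returns NULL on error):
if ($content_replaced === null) {
    $errors = array(
        PREG_INTERNAL_ERROR        => 'internal error',
        PREG_BACKTRACK_LIMIT_ERROR => 'backtrack limit exhausted',
        PREG_RECURSION_LIMIT_ERROR => 'recursion limit exhausted',
        PREG_BAD_UTF8_ERROR        => 'malformed UTF-8 in subject',
    );
    $code = preg_last_error();
    echo isset($errors[$code]) ? $errors[$code] : "error code $code", PHP_EOL;
}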
