PHPUnit code coverage generation causing memory exhaustion

I'm currently using Zend Framework in conjunction with PHPUnit to conduct unit testing on an application. When Hudson executes the PHPUnit shell command, the maximum PHP memory limit is reached sometime during code coverage generation. I currently have a total of 41 tests with 334 assertions.
I have successfully eliminated this error by raising the memory_limit setting to 768M using the -d memory_limit=768M switch; however, I am worried that as the complexity increases along with the total number of tests/assertions, I will not have enough memory to generate the HTML for code coverage statistics.
OS: CentOS 5.5
Control Panel: WHM/cPanel
CI Server: Hudson
/usr/local/bin/phpunit \
  --verbose \
  -d memory_limit=512M \
  --log-junit ../../build/logs/phpunit.xml \
  --coverage-clover ../../build/logs/coverage/clover.xml \
  --coverage-html ../../build/logs/coverage-html/
Fatal error: Allowed memory size of 536870912 bytes exhausted
I develop on Windows 7 before committing my changes and letting Hudson handle the rest. Memory usage never exceeded 340 MB while running the same command on Windows 7.

I was able to essentially eliminate this error by reducing the number of files included in code coverage and by raising the overall PHP memory limit. The entire Zend Framework, which is very large, was being included in code coverage.
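For illustration, a minimal phpunit.xml filter along these lines keeps the framework out of the report; the paths are hypothetical and the <filter>/<whitelist> syntax assumes a PHPUnit version of that era:
<phpunit bootstrap="bootstrap.php">
    <filter>
        <whitelist>
            <!-- collect coverage for application code only -->
            <directory suffix=".php">../../application</directory>
            <exclude>
                <!-- keep the Zend Framework library out of the report -->
                <directory suffix=".php">../../library/Zend</directory>
            </exclude>
        </whitelist>
    </filter>
</phpunit>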

Do you have Xdebug profiling enabled? If so, try disabling it. I've experienced this problem before, and it came down to PHP extensions (specifically Xdebug profiling and/or the Inclued hierarchy viewer).
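For reference, the relevant php.ini lines to check look like this (assuming Xdebug 2.x):
; keep the Xdebug profiler off during coverage runs
xdebug.profiler_enable = 0
xdebug.profiler_enable_trigger = 0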

As of 2019, you can use the PCOV driver with PHPUnit to generate your code coverage report. In my experience, it's only marginally less performant than running a plain PHPUnit suite.
Read Speed up PHPUnit Code Coverage Analysis for some good benchmarks comparing Xdebug, phpdbg, and PCOV. It also has instructions on how to enable PCOV on PHPUnit 8.
Read Setup PHP PCOV for 5 times faster PHPUnit code coverage for instructions on setting up PCOV on PHPUnit 7 and below.
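In rough terms, enabling PCOV looks like this, assuming the extension is installed via PECL (PHPUnit 7 and below additionally need the clobber package covered in that article):
pecl install pcov
and then in php.ini:
; load PCOV and enable coverage collection
extension=pcov.so
pcov.enabled = 1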

Related

How to make phpunit faster on big bootstrap

I've tried opcache, but it actually makes things slower. Some CLI programs have a watch option, where the tool keeps running and redoes its work whenever files change; I couldn't find anything like that for PHPUnit.
Just to be clear, the tests themselves are pretty fast; what's slow is the bootstrap, because the codebase is rather big. The codebase runs fine on a web server thanks to opcache, but for PHPUnit, opcache (which I suppose could only help here via its file cache?) seems to be hopeless...
I'm using PHP 8 and PHPUnit 9
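For what it's worth, these are the opcache directives that govern CLI runs; a sketch, with an illustrative cache path:
; let CLI processes use opcache, backed by an on-disk cache between runs
opcache.enable_cli = 1
opcache.file_cache = /tmp/php-opcache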

How can I speed up tests with PHPUnit running with Xdebug

I'm running tests with PHPUnit, using Xdebug to generate coverage, and it is very slow.
I tried using phpdbg, but that leads to memory errors.
I was told that I can create a filter file and that it should help. Can anyone explain to me how that would work?
The documentation for PHPUnit has information on speeding up code coverage with Xdebug. For more background information I recommend this article.
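In outline, the filter approach works like this (a sketch assuming PHPUnit 7.4+ and Xdebug 2.6+, which introduced xdebug_set_filter()): PHPUnit dumps a small script that restricts Xdebug's coverage collection to your whitelisted code, and you prepend that script to subsequent runs:
phpunit --dump-xdebug-filter build/xdebug-filter.php
phpunit --prepend build/xdebug-filter.php --coverage-html build/coverage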
That being said, I recommend having a look at PCOV for even faster code coverage data collection for PHP and PHPUnit.

How can I improve Codeception Code Coverage speed

We have written some unit tests for our PHP Laravel 5.5 application using Codeception. For additional context, our Laravel codebase has around 200k LOC. Normal unit test runs are really fast: more than 200 tests finish within an hour.
The main issue is that when we enabled code coverage in Codeception, which uses Xdebug by default, the execution time increased drastically.
It has now been running for a week, and the whole code coverage execution still hasn't finished.
I am not sure whether the problem lies with Codeception or with Xdebug itself, but if anybody has experience running PHP code coverage on a huge codebase, it would be nice if you could share how you achieved it. I would also appreciate suggestions for other tools to look into. We are currently considering switching to PHPUnit but are still open to exploring other tools.
Replacing Codeception with PHPUnit will be a lot of work for little gain, because Codeception uses PHPUnit and its PHP-Code-Coverage library under the hood.
There is a new code coverage extension called pcov, which is supposedly much faster than xdebug.
https://github.com/krakjoe/pcov/blob/develop/INSTALL.md
I haven't tried it myself, but be aware that it requires PHPUnit 8, which is only available on PHP 7.2 or later.
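If you do try it, a hypothetical invocation that loads PCOV just for the coverage run might look like this:
php -d extension=pcov.so -d pcov.enabled=1 vendor/bin/codecept run --coverage --coverage-html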
Recently I have seen code coverage sped up by replacing xdebug with phpdbg - I can't give exact numbers as the code base has extensive functional tests in its test run (and the speed-up was only for unit tests), but a 2+ hour test and coverage run has been reduced to around 50 minutes.
Note that xdebug and phpdbg can differ in their code coverage (it looked like xdebug better dealt with opcache optimisations).
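For reference, the phpdbg route is invoked along these lines (a sketch; the suite name and output options are illustrative):
phpdbg -qrr vendor/bin/codecept run unit --coverage --coverage-html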
edit:
Since replacing xdebug with phpdbg, I have seen further speed improvements by replacing phpdbg with pcov.

PHP ZipArchive's open/addFile methods crash with a fatal error on big datasets

We have a PHP (version 5.3.10) CLI application doing some heavy work on an Ubuntu 12.04 64-bit machine. This script can run for a long time depending on the dataset it receives. These datasets are zip files containing a lot of XML, image, and MS doc files.
Earlier, this script used a few system commands (shell, perl, java) to complete its task, and we had no problems then. Recently, we upgraded these scripts to use RabbitMQ for multiple concurrent invocations, moved from cron-based operation to supervisord for automatic recovery and monitoring, and switched to PHP's core libraries and functions as much as possible to avoid shell invocations.
Now, after deploying to production, we found that the script fatally crashed on a line where ZipArchive was used to create an archive; to be specific, only in its "open" and "addFile" methods. We tested this many times with the problematic dataset and confirmed that this is where the real problem is.
The error thrown was "Fatal error: Maximum execution time of 300 seconds exceeded". We know about PHP's limit on execution time, and we double-checked php.ini and all the settings under the "/etc/php5/conf.d" folder; everywhere, "max_execution_time" is set to 0. We also checked that the script's SAPI mode was "cli" using php_sapi_name(), and ini_get("max_execution_time") also returns 0.
Even when the script is managed by supervisord, the SAPI mode and execution limit are the same. We could not find where this "max_execution_time" limit of 300 seconds is being triggered from.
One more thing: the script had actually been running for more than 600 seconds when it crashed with this message. We also suspect this only happens when ZipArchive by itself takes more than 300 seconds, but we are not sure. The partial zip archive it creates when this happens is between 280 MB and 290 MB. We downloaded the PHP source from its repository and did a quick grep to see if ZipArchive's codebase had any such limits; we found none.
We are now trying to replace ZipArchive php code with shell command as a work around. We are yet to test it. I will post our findings here soon.
Have any of you faced such issues before? Is this something related to ZipArchive? Is it recommended to use ZipArchive for creating huge archives?
I had the same problem once when using ZipArchive with files > 500 MB. In some cases it also acts up when the size is considerably smaller but the number of files is higher. I finally ended up creating a wrapper over the Linux zip/unzip commands, so at the core it is basically just doing an exec() at the OS level. I never had problems with that. Of course you need a sysadmin to set up permissions and all, but it's a stable solution.
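A minimal sketch of such a wrapper, with a hypothetical function name, assuming the zip binary is on the PATH:
<?php
// Hypothetical wrapper: shell out to the system zip command instead of ZipArchive.
function createArchive($zipPath, array $files)
{
    // escape every argument so filenames with spaces or metacharacters are safe
    $cmd = 'zip -q ' . escapeshellarg($zipPath);
    foreach ($files as $file) {
        $cmd .= ' ' . escapeshellarg($file);
    }
    exec($cmd, $output, $exitCode);
    if ($exitCode !== 0) {
        throw new RuntimeException('zip failed with exit code ' . $exitCode);
    }
}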

PHP memory profiling

What's a good way to profile a PHP page's memory usage? For example, to see how much memory my data is using, and/or which function calls are allocating the most memory.
xdebug doesn't seem to provide memory information in its profiling feature.
xdebug does provide it in its tracing feature. This is pretty close to what I want, except the sheer amount of data is overwhelming, since it shows memory deltas for every single function call. If it were possible to hide calls below a certain depth, maybe with some GUI tool, that would solve my problem.
Is there anything else?
As you probably know, Xdebug dropped memory profiling support as of the 2.* versions. Search for the "removed functions" string here: http://www.xdebug.org/updates.php
Removed functions
Removed support for Memory profiling as that didn't work properly.
So I've tried another tool and it worked well for me.
https://github.com/arnaud-lb/php-memory-profiler
This is what I've done on my Ubuntu server to enable it:
sudo apt-get install libjudy-dev libjudydebian1
sudo pecl install memprof
echo "extension=memprof.so" > /etc/php5/mods-available/memprof.ini
sudo php5enmod memprof
service apache2 restart
And then in my code:
<?php
memprof_enable();
// do your stuff
memprof_dump_callgrind(fopen("/tmp/callgrind.out", "w"));
Finally open the callgrind.out file with KCachegrind
Using Google gperftools (recommended!)
First of all install the Google gperftools by downloading the latest package here: https://code.google.com/p/gperftools/
Then as always:
sudo apt-get update
sudo apt-get install libunwind-dev -y
./configure
make
make install
Now in your code:
memprof_enable();
// do your magic
memprof_dump_pprof(fopen("/tmp/profile.heap", "w"));
Then open your terminal and launch:
pprof --web /tmp/profile.heap
pprof will open a graphical call-graph view in a new window in your existing browser session.
Xhprof + Xhgui (the best in my opinion to profile both cpu and memory)
With Xhprof and Xhgui you can profile the cpu usage as well or just the memory usage if that's your issue at the moment.
It's a very complete solution: it gives you full control, and the logs can be written either to MongoDB or to the filesystem.
For more details see my answer here.
Blackfire
Blackfire is a PHP profiler by SensioLabs, the Symfony2 guys https://blackfire.io/
If you use puphpet to set up your virtual machine you'll be happy to know it's supported ;-)
Well, this may not be exactly what you're looking for, but PHP does have a couple of functions built-in that will output memory usage. If you just wanted to see how much memory a function call is using, you could use memory_get_peak_usage() before and after a call, and take the difference.
You can use the same technique around your data using the very similar memory_get_usage().
Pretty unsophisticated approach, but it's a quick way to check out a piece of code. I agree that xdebug mem deltas can be too verbose to be useful sometimes, so I often just use it to narrow down to a section of code, then dump out specific memory usage for small pieces manually.
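A quick sketch of that before/after technique; buildLargeStructure() is a hypothetical stand-in for whatever you want to measure:
<?php
function buildLargeStructure()
{
    // hypothetical workload: allocate a large array
    return range(1, 1000000);
}

$before = memory_get_usage();
$data = buildLargeStructure();
$after = memory_get_usage();
printf("Delta: %d bytes\n", $after - $before);
printf("Peak so far: %d bytes\n", memory_get_peak_usage());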
Xdebug reimplemented memory tracing in 2.6 (2018-01-29), and the output can be viewed in QCachegrind or a similar tool. Just make sure to select the memory option :)
From the docs:
Since Xdebug 2.6, the profiler also collects information about how much memory is being used, and which functions and methods increased memory usage.
I'm not familiar with the format of the file, but QCachegrind has worked great for me in tracing a couple of memory issues.
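For completeness, a minimal php.ini sketch for an Xdebug 2.6+ profiling run (the output directory is illustrative):
; enable the profiler; since 2.6 the cachegrind output also records memory
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp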
http://geek.michaelgrace.org/2012/04/tracing-php-memory-usage-using-xdebug-and-mamp-on-mac/
I'm on a Mac so if you're on Windows you'll have to test this, but this works for me.
I modified my tracefile-analyzer.php file and added the path to the PHP binary at the top so that it can be called in the terminal like a normal Unix script.
#!/Applications/MAMP/bin/php5.3/bin/php
<?php
if ( $argc <= 1 || $argc > 4 )
{
Don't forget to chmod this file to 755.
You could easily create a Ruby watchr script to automatically call the analyzer each time Xdebug creates a memory trace file (*.xt). That way you could keep testing and seeing your improvements without having to execute the command over and over.
