PHP profiling - need some advice

I'm pretty new to php and absolutely new to profiling, so please forgive me if my question is a little bit "simple".
So I enabled Xdebug and downloaded KCachegrind to begin profiling, and read through what I could find. My problem is that I'd like to improve speed, but whenever I view my Xdebug reports in KCachegrind, the same script on the same hardware runs with a noticeably different execution time (from 1.1 to 1.9 sec). I'm executing it on localhost so that I do not have to deal with network speed.
I've read this pretty good article by John Lim http://phplens.com/lens/php-book/optimizing-debugging-php.php, and in it he works with fixed execution times. I don't know what I am doing wrong, or how I could get consistent times so I can see whether I'm actually getting better results.
Any advice would be appreciated, as would any resource you have read and found useful on PHP profiling. Thanks for the replies!

Xdebug is good, but I found that XHProf, an extension written by Facebook for its production profiling, works a lot better. It shows you the wall times of all functions and the call stack, and you can drill down many levels and see the wall time of each call. You can also compare and consolidate similar calls, and compare them over a period of time. Have a look at the XHProf demo; it's a very simple extension to implement. It's a matter of enabling a header and a footer and you are off to the races.
http://xhprof.io/
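The "header and footer" approach above can be sketched roughly as follows. This is a minimal sketch, not XHProf's official recipe: it assumes the xhprof PECL extension is installed (the code guards on that), and the dump path and suffix are placeholders.

```php
<?php
// Sketch of the XHProf header/footer approach.
// The /tmp path and the "myapp" run suffix below are placeholders.

// -- header.php: start profiling at the top of the request --------
$profilingActive = extension_loaded('xhprof');
if ($profilingActive) {
    // collect CPU and memory stats in addition to wall time
    xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);
}

// ... the application handles the request here ...

// -- footer.php: stop profiling and save the run ------------------
if ($profilingActive) {
    $data  = xhprof_disable();            // per-function wall/CPU/memory stats
    $runId = uniqid('run_', true);
    file_put_contents("/tmp/{$runId}.myapp.xhprof", serialize($data));
}
```

A viewer such as xhprof.io can then be pointed at the directory the runs are written to.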
Cheers,
Thusjanthan

Related

What can delay "Time to First Byte" by almost 2s?

I have enabled a PHP profiler - well, Magento's, but it is still a profiler.
This is the standard Magento profiler, which records all the processing that goes into creating the page, including all DB queries, from the moment the request is received.
I am testing with the PHP built-in server hosted locally.
The results show pretty decent server response times, but in the Chrome developer tools the Time to First Byte is much higher. Why is this?
Take note: the screenshots below of the timings are from the SAME REQUEST.
Profiler (www.mysite.com/index.php):
Developer tools (www.mysite.com/index.php):
Edit
I had written a somewhat longer answer, but then I realized that you are using the built-in PHP server. Honestly, you should give a real web server a go. You might be chasing a problem that does not exist (although one could argue that something is going on, because this is not a normal situation - you might be encountering a PHP bug) ;)
Found the issue.
The profiler is not recording included libraries. Nothing in the magento/lib folder is recorded, even if you explicitly set it to be profiled.
It seems that we need to use an all-encompassing PHP profiler.

Measuring PHP performance

I'm trying to track down issues with an application [MODX]. I have several of these sites [about 10] on my server and was wondering how I can see what PHP is doing.
Pages on these sites are extremely slow, while the same sites in dev are fine, as are other PHP applications on the server.
I tried using Xdebug to get an idea of what PHP was doing while processing these pages and where the bottleneck was occurring, but it only appeared to do anything on an error [there are no errors being thrown].
Any suggestions on how to track this down?
[linux/Centos5/php5.2.?/apache2]
Xdebug and Webgrind are a nice way to see where your bottlenecks are...
Read XDEBUG_PROFILE and Webgrind
Set up php.ini to have Xdebug profile your code on every run, or only when a special parameter is passed; then point Webgrind at the same directory Xdebug writes its profile dumps to.
Webgrind will show you which functions and sets of functions require the most time; it breaks everything down and makes it easy to find slow and/or inefficient code (e.g. your script is calling "PDOStatement->execute" 300 times on a fast query [or calling it once on a massively slow one], taking up 90% of the execution time).
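For reference, a php.ini fragment for this setup might look like the sketch below. These are Xdebug 2 setting names (the version current when this thread was written); Xdebug 3 renamed them (e.g. `xdebug.mode=profile`), and the file paths are only examples.

```ini
; php.ini - Xdebug 2 profiler settings (paths are examples)
zend_extension=/usr/lib/php/modules/xdebug.so

; profile every request ...
xdebug.profiler_enable = 1
; ... or set profiler_enable = 0 and trigger per request by passing
; XDEBUG_PROFILE as a GET/POST parameter or cookie
xdebug.profiler_enable_trigger = 1

; directory to point Webgrind (or KCachegrind) at
xdebug.profiler_output_dir = /tmp
xdebug.profiler_output_name = cachegrind.out.%t.%p
```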
The most commonly used tool for finding bottlenecks in PHP would be Xdebug. But you should also manually examine the codebase.
There are three different areas you will have to focus on:
frontend performance
SQL queries
the PHP logic itself
... and their impact on the perceived speed is in that order.
You should start by running YSlow and making sure that your site follows its guidelines as closely as possible.
The next step would be tracking down which SQL queries are executed and (assuming you are using MySQL) running them with EXPLAIN. Also check the queries themselves. There might be some extremely stupid code there, like ORDER BY RAND() or the use of LIKE on huge tables.
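A small helper makes it easy to run EXPLAIN on a suspect query through PDO. This is a sketch; the commented-out DSN, credentials, and table names are placeholders, and the columns to look at (`type`, `rows`) are MySQL's EXPLAIN output.

```php
<?php
// Sketch: prefix a query with EXPLAIN and return the plan rows,
// so you can spot full table scans. DSN/credentials are placeholders.
function explainQuery(PDO $pdo, $sql, array $params = array())
{
    $stmt = $pdo->prepare('EXPLAIN ' . $sql);
    $stmt->execute($params);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// Example usage against MySQL (hypothetical DSN and table):
// $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
// $rows = explainQuery($pdo, 'SELECT * FROM posts WHERE title LIKE ?', array('%php%'));
// foreach ($rows as $row) {
//     // type = ALL plus a large rows estimate usually means a full table scan
//     printf("table=%s type=%s rows=%s\n", $row['table'], $row['type'], $row['rows']);
// }
```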
And the last stage in fixing it all would be a hard look at the code itself, on both the PHP and JavaScript sides of things.
Also, you should upgrade to PHP 5.3, because your version is extremely outdated.
Usually, when you don't know what you're looking for, you cannot spot it with tools like Xdebug or the plugins/debug bars built into a CMS/framework. New Relic is the simplest solution - you'll be able to spot bottlenecks after a few minutes.
While New Relic is a paid app, you can try it for free for the first 14 days - that's more than enough to find the problem.
It's great because it integrates all the other tools and data sources you usually use:
Xdebug, CPU & I/O monitoring, the MySQL slow log, query logs.
It will also show you whether your app is slow in PHP, the DB, the frontend, or the network.
You should try it out instead of wasting time debugging with other tools.
Here is a guide for CentOS installation: https://newrelic.com/docs/php/php-agent-installation-redhat-and-centos

Slow PHP script - automatic debug and diagnosis?

How can I find out whether a PHP script goes bad and runs really slowly when run by hundreds of users every second - and better yet, is there any tool that could tell me approximately which part of the code slows me down?
...
I don't wish to post the code here (mainly because this question refers to something else, and because it would be a waste of space), and I'd prefer never to post it anywhere, because it's actually a mess... a mess that I understand, and yes, I coded it, but still a mess that would insult anyone trying to comprehend it. So if you have any creative ideas, please let me know!
Cheers!
( thank you already for your incoming answers! )
Enable Xdebug profiling, and send the resulting files through WinCacheGrind (Windows) or KCachegrind (Linux).
This will allow you to see a breakdown of which functions get called most, and where the time is spent. Learning to use XDebug is a must for any serious PHP developer.
Here is a seemingly good tutorial on getting started with XDebug profiling.
You will need two tools:
a profiler (Google it) - I use this one at work: http://www.nusphere.com/products/php_profiler.htm (commercial)
a load tester - check this site for more info: http://performance-testing.org/content/performance-testing-tools
I'd recommend using a PHP profiler. Xdebug, which is both a PHP debugger and a profiler, can help a lot. There are also other debuggers, e.g. Zend Debugger.
To analyze the profiling results you may also need a special tool. I used WinCacheGrind on Windows and KCachegrind on Linux.
The profiling report shows tons of useful information, e.g. which lines of the source code were called how many times, and which functions took most of the execution time.

Efficiently gathering information about the inner workings of a new PHP project. Tools? Techniques? Scripts?

I am soon to join a PHP project that has been developed over the course of several years. It's going to be huge and sparsely documented, with many files and piles of code; no consistent quality level is to be expected.
How would you go about gathering as much information as possible about what is going on?
Autoloading is not to be expected, at least not extensively, so inclued might do a good job revealing the interdependencies.
Having phpDocumentor digest the project files might give an idea about which classes/methods/functions are present.
Maybe phpCallGraph for method/function relations.
Profiling some generic use cases with Xdebug to gain an idea about the hierarchies and concepts.
Inspecting important log files ... checking out warnings, deprecated usages, errors.
phpinfo().
Maybe extracting all comments and processing them into an HTML file.
This didn't cover unit tests, databases, ....
What would you do? What are your experiences with the mentioned tools, and how do you get the most out of them?
You can assume any condition necessary.
What statistical information could be useful to extract?
Does anybody have experience with those tools?
EDIT from "PHP Tools for quality check":
PHP Mess Detector
Copy/Paste Detector (CPD) for PHP code by Sebastian Bergmann
EDIT 2 from Bryan Waters' answer:
phploc - a tool for quickly measuring the size of a PHP project.
Inspecting Apache logs and Google Analytics data to find the top requested URLs, then analyzing what happens using Xdebug profiling and a tool like KCachegrind.
See his answer for concrete techniques.
Setting up a deployment / build / CI cycle for PHP projects - suggested by Pekka
EDIT 3
Just found this PDF of a talk by Gabriele Santini - "Statistical analysis of the code - Listen to your PHP code". This is like a gold mine.
I agree that your question already contains most of the answers.
This is what I would probably do.
I would probably start with Sebastian Bergmann's tools, especially phploc, so you can get an idea of the scope of the mess (codebase) you are looking at. It gives you class and function counts, etc., not just lines of code.
Next I would look in the Apache logs or Google Analytics and get the 10 most requested PHP URLs. I'd set up Xdebug with profiling and run through those top 10 requests to get the files and the call tree. (You can view these with a cachegrind tool.)
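Pulling the top requested URLs out of an access log can be done with a few lines of PHP. A sketch, assuming the Apache combined log format; the log path in the usage comment is an example:

```php
<?php
// Sketch: tally the most-requested URLs from an Apache access log
// so you know which pages to profile first.
function topUrls($lines, $limit = 10)
{
    $counts = array();
    foreach ($lines as $line) {
        // combined log format contains: "GET /path HTTP/1.1"
        if (preg_match('/"(?:GET|POST) (\S+) HTTP/', $line, $m)) {
            $url = $m[1];
            $counts[$url] = isset($counts[$url]) ? $counts[$url] + 1 : 1;
        }
    }
    arsort($counts);                              // most requested first
    return array_slice($counts, 0, $limit, true); // keep URL keys
}

// Usage (example path):
// print_r(topUrls(file('/var/log/httpd/access_log')));
```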
Finally, I'd read through the entire execution path of one or two of those traces, whichever is most representative of the whole. I'd use my Eclipse IDE, but printing them out and going to town with a highlighter is valid as well.
The top-10 method might fail you if there are multiple systems cobbled together. You should quickly see with Xdebug whether the top 10 are coded similarly or whether each is a unique island.
I would look at the MySQL databases and try to understand what they are all for, especially looking at table prefixes; you may have a couple of different apps stacked on top of each other. If there are large parts of the DB not touched by the top 10, you need to go hunting for the sub-apps. If you find other sub-apps, run them through the Xdebug profiler and then read through one of the paths that is representative of that sub-app.
Now go back, look at your scope numbers from phploc, and see what percentage of the codebase (probably counting classes or functions) was untouched during your review.
You should now have a basic understanding of the most often run code and an idea of how many nooks, crannies, and closets for skeleton storage there are.
Perhaps you can set up a continuous integration environment. In this environment you could gather all the statistics you want.
Jenkins is a fine CI server with loads of plugins and documentation.
For checking problems which could be expected (duplicate code, potential bugs...), you could use some of those tools:
https://stackoverflow.com/questions/4202311/php-tools-for-quality-check
HTH
PS: I have to say that your question, IMO, already contains a lot of great answers.
If you're into statistical stuff, have a look at the CRAP index (Change Risk Analysis and Predictions), which measures code quality.
There is a nice two-part introductory article:
First part
Second part
Having both built, and suffered from, huge spaghetti-y legacy PHP projects, I think there is only so much you will be able to do using analysis tools. Most of them will simply tell you that the project is of terrible quality :)
Unit testing and source-code documentation tools usually need some active contribution inside the code to produce usable results. That said, they are all surely worth trying out - I'm not familiar with phpCallGraph or Sebastian Bergmann's tools. Also, phpDocumentor may be able to make some sense out of at least parts of the code. PHPXref is also a cool tool to get an overview; here's a (slow) demo.
The best way to start could be just taking a simple task that needs to be done in the code base and fighting your way through the jungle - following includes, trying to grasp the library structure, etc. - until the job is done. If you're joining a team, maybe have somebody nearby you can ask for guidance.
I would concentrate on making the exploring process as convenient as possible. Some (trivial) points include:
Use an IDE that can quickly take you to function/method, class and variable definitions
Have a debugger running
Absolutely keep everything under source control and commit every change
Have an environment that allows you to easily deploy a change for testing, and as easily switch to a different branch or roll everything back altogether. Here is a related question on how to set something like that up: Setting up a deployment / build / CI cycle for PHP projects

function to profile / performance test PHP functions?

I'm not experiencing any performance issues; however, I'd like to take a look at what takes how long, and how much memory and CPU it uses, etc.
I'd like to get a firsthand understanding of which things can be bottlenecks, and improve any code I might reuse or build upon... (perfectionist)
I'm looking to create a little function that I can call at the beginning and end of each function that records:
execution time
memory used
cpu demand
any ideas?
I haven't used things like memory_get_usage() or methods of recording time() before, so I would love some tips on their combined implementation.
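A minimal sketch of the begin/end helper described above, combining microtime() and memory_get_usage(). It covers execution time and memory; proper CPU demand would need something like getrusage(), which is less portable, so it's left out here.

```php
<?php
// Sketch: wrap a callable and record wall time and memory deltas.
// (CPU demand would need getrusage(), omitted for portability.)
function profileCall($fn)
{
    $t0 = microtime(true);        // wall-clock start, as a float
    $m0 = memory_get_usage();     // bytes currently allocated

    $result = $fn();

    return array(
        'result'       => $result,
        'seconds'      => microtime(true) - $t0,
        'memory_bytes' => memory_get_usage() - $m0,
        'peak_bytes'   => memory_get_peak_usage(),
    );
}

// Usage:
// $stats = profileCall(function () {
//     return array_sum(range(1, 100000));
// });
// printf("took %.4fs, used %d bytes\n", $stats['seconds'], $stats['memory_bytes']);
```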
There are already a host of solutions made just for that, you might want to have a look at some of these:
XDEBUG EXTENSION FOR PHP
Xdebug's Profiler is a powerful tool that gives you the ability to analyze your PHP code and determine bottlenecks, or generally see which parts of your code are slow and could use a speed boost.
Other Resource:
PHP Quick Profiler
I haven't tested it a lot, but a friend of mine recommended http://xdebug.org/ for profiling PHP.
Try using Xdebug to trace your code flow. Xdebug will generate files that show how your code performs; you can use KCachegrind to visualize those files.