xdebug, having problems with profiler output - php

Right, since watching Rasmus Lerdorf's talk on PHP performance I've been wanting to profile the ERP / Accounting application I am working on, not least because I know there are performance issues with it; profiling should highlight the major problems for me to investigate.
So I downloaded Xdebug and put the following few lines in my php.ini file:
zend_extension="/usr/lib/php5/20090626+lfs/xdebug.so"
xdebug.profiler_output_dir="/home/me/xdebug/profiles/"
xdebug.profiler_enable_trigger=On
With this I simply aim my browser at my app with &XDEBUG_PROFILE in the query string and the profiling begins. The problem is that the output I am viewing with KCacheGrind doesn't include any of the functions from within my application, and no flow between entities.
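For reference, the trigger is just a query-string parameter; an illustrative request (the host and path here are made up) looks like:
http://localhost/erp/index.php?page=dashboard&XDEBUG_PROFILE=1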
While the page was executing I copied (in the terminal) the profile file to a separate file several times, to capture its state throughout the run. I loaded each of these copies separately into KCacheGrind and they all show the full profile of the application, all except the last one.
Can anyone tell me why the full profile isn't being output? Looking at the file sizes of my copied files, the first few are rather large but the last one is significantly smaller. Is Xdebug messing with the file after it has been captured?
Many thanks :-)
EDIT
Just to help, this is what I see when I open up one of the copied profiles (before the profile has completed); I'm sure there is much more to it than this.
And this is what I get from the final profile: no relationships, just a bunch of PHP functions. I want to see the full profile.
EDIT 2
So here I am constantly running the ls -als command; the last listing is the cut-down version, and the previous one is the last ls where the file was at its full size.
I cannot upload the large file as it's over 3 million lines long; if it helps, here is the Xdebug section of my phpinfo() output.

Right, I've actually solved the problem myself. I added this option to my php.ini file:
xdebug.profiler_append=1
This appends the data to the same file if it already exists, so I'll need to make sure the output filename option is set correctly, but I think that has solved my problem for now.
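For anyone trying this themselves, here is a minimal sketch of the relevant php.ini block (the output directory comes from the question above; xdebug.profiler_output_name is the filename option referred to, and in Xdebug 2 the %s specifier expands to the script name):
xdebug.profiler_enable_trigger=On
xdebug.profiler_output_dir="/home/me/xdebug/profiles/"
xdebug.profiler_output_name="cachegrind.out.%s"
xdebug.profiler_append=1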
Thanks to those that answered :-)

Related

Debug PHP Using WAMP and an IDE

I recently started web development. The course I took was to install WAMP and start developing right away. I used the Atom text editor; this, combined with WAMP, proved to be a very fast way to write client-side code (HTML, CSS, JavaScript).
But when I started to write server-side PHP, things got a little messy. I should probably explain my site's structure here.
I keep separate PHP, CSS, and JavaScript files for every page on the client side. For the server side I have two different types of PHP files:
Files that only perform a specific operation on the database (for example, returning "5 more answers"). These are always called by AJAX requests.
Files that load the page for the first time. These are only used when the user opens the page for the first time; they do the necessary database queries and return the page. Later requests always go to the first type of PHP file.
Now regarding my problem. Until now I debugged by printing variables to the screen with var_dump() or echoing, but this started to become too slow as the data I work with grew. I wonder if there is a way of debugging that will let me put a breakpoint in one of my PHP files and then, when I open it in the browser on the localhost I created using WAMP, step through the PHP file line by line.
I have been dealing with this issue for 3 days. I tried to make it work with the Eclipse IDE but couldn't find a way. Also, there seem to be no tutorials or Q&A on the internet regarding the issue.
Breakpoint debugging opens a whole new world, and is the natural step after var_dump() debugging. Not only does it speed up development, but it provides much more information about your code, as you can step through each line, see what values have been set at each step, and watch how they evolve as your program executes. This means you can track all of the values at different stages in a single run; imagine trying to track all variables at each point using var_dump()!
Although choosing an IDE is a personal decision based on personal taste, I strongly recommend you try out PhpStorm. If you can get a student licence, go for it.
PhpStorm has extensive documentation & tutorials on all features in the IDE, debugging is no exception:
https://www.jetbrains.com/help/phpstorm/configuring-xdebug.html
https://www.youtube.com/watch?v=GokeXqI93x8
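For what it's worth, the php.ini side for the Xdebug 2.x builds that WAMP shipped at the time looks roughly like the sketch below (PhpStorm listens on port 9000 by default for Xdebug 2; the IDE key is just a conventional example):
xdebug.remote_enable=1
xdebug.remote_host=127.0.0.1
xdebug.remote_port=9000
xdebug.idekey=PHPSTORM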
I don't know of a specific solution to your issue. I'm not exactly sure what you're doing, but as a quick tip, I find adding the following snippet to the top of the file useful, as it surfaces errors much more readily rather than the browser just saying nope.
error_reporting(E_ALL);
ini_set('display_errors', 'On');
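One caveat worth adding (my note, not part of the original answer): if the script itself contains a parse error, the ini_set() call never runs, so during development you may also want to enable these directly in php.ini:
display_errors = On
error_reporting = E_ALL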
Hope this helps you a bit.
I tried out what's recommended in the comments and answers. I first tried NetBeans. To be fair, it disappointed me: the download kept getting stuck at 100%, even for different versions, and when I stopped downloading and went ahead to create a PHP project, there were parts missing, I guess; I couldn't even manage to create a PHP project. But that might just be me not being able to do it.
Then I followed #leuquim's answer and #Alex Howansky's comment and downloaded PhpStorm, and I got it to work in no more than 20 minutes. I downloaded it with a student licence. For people who want to use PhpStorm with WAMP, here's a YouTube tutorial:
https://www.youtube.com/watch?v=CxX4vnZFbZU
One thing to note in the video: the maker of the video chooses "PHP Web Application" in the Run Configurations; that option has since been renamed to "PHP Web Page".

phpdoc does not update my documentation

phpDocumentor v1.4.4
Fedora 24
Command line: phpdoc -d ./docsrc -t ./output
I am running phpDocumentor on Fedora 24 and have successfully generated documentation for my project one time.
I added a docblock to a function and ran phpdoc again, but the output has not been updated. I verified the timestamps of the generated files: they have been regenerated, but they do not reflect the changes.
I subsequently made numerous changes, and reran phpdoc after each change, but the generated documentation does not update.
I erased all the output files, renamed the directory of the input files, in short have done all I can to persuade phpdoc to generate new documentation that reflects the changes to my php files to no avail.
It would seem that phpdoc is caching the output somewhere, but I cannot find where. I searched every path on my disk containing phpdoc, then searched for the word "cache" in each of those paths, but it does not occur.
I tried changing the template with the --template directive but it does not recognise this directive.
I have tried using the --force directive but it does not recognise this directive.
Can someone enlighten me?
Cheers,
Peter
This sounds like one of those times where I would just walk through the process from the beginning:
Am I modifying source in the ./docsrc directory tree? Verify by opening the source member in vi/vim/nano/some-other-editor just to be sure the source has changed.
Have I modified the source using the correct syntax? (Please post some code that shows documentation that isn't being updated)
Modify documentation in another file with a simple change and see if that simple change appears when I regenerate my documentation.
Am I explicitly --ignore-ing the file or directory I'm expecting to change? (You don't appear to be)
Do I have a phpdoc.xml or phpdoc.dist.xml file with an <ignore> directive? (See the sketch after this list.)
Do I have the necessary permissions to create/update files in the ./output directory?
After I've executed phpdoc -d ./docsrc -t ./output do I see the expected change when using vi/vim/nano/some-other-editor?
Is my browser caching previous versions of the documentation? (I know you've already ruled this out Peter, I'm just trying to make my answer complete)
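For reference, a minimal sketch of a phpdoc.dist.xml with an <ignore> directive, as mentioned in the list above (the directory names are placeholders, and the exact schema depends on your phpDocumentor version):
<?xml version="1.0" encoding="UTF-8"?>
<phpdoc>
  <parser>
    <target>output</target>
  </parser>
  <files>
    <directory>docsrc</directory>
    <ignore>docsrc/vendor/*</ignore>
  </files>
</phpdoc>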
This is EXACTLY one reason why I created PHPFUI/InstaDoc! The problem with most documentation is that it is static. While that is great for libraries that don't change, if you want to document your own code, guess what? It tends to change every day! With InstaDoc, you can see the documentation instantly on your local machine before you even check it in. InstaDoc creates the documentation when you request the page. It is hands down the fastest documentation system out there. Most documentation systems create static pages and brag about how fast they can create the documentation. But guess what? Who cares? What you want is to see the documentation of your current code base right now. It turns out it only takes a few seconds to scan through all the files of the libraries you are using. InstaDoc caches that information, so you only have a long scan (and even then only a few seconds) the first time, or whenever you add a new library.
Once you have a library scanned, the documentation comes up instantly, since it uses PHP reflection classes to read the file and display the documentation. So that file you just modified is completely, 100% documented. Don't like the comments? Change them, refresh the page. See an issue? Correct it, refresh the page. Notice something could be better? Refresh the page. Want to check out the docs on a PR? Easy: just delete the cached index and refresh the page.
InstaDoc is open source and still young. Check it out and submit comments or PR's if it does not meet your needs, but it is the future of documentation. It will also generate static files for high volume sites, but the most important feature is that it gives you an instant reflection of your just edited code, and that is what makes it awesome.
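To illustrate the reflection approach described above (a minimal sketch, not InstaDoc's actual code; the Invoice class is invented for the example):
<?php

/** Calculates invoice totals. */
class Invoice
{
    /** Returns the total including tax. */
    public function total(float $rate): float
    {
        return 100.0 * (1.0 + $rate);
    }
}

// Reflection reads docblocks straight from the current source, so the
// "documentation" is always as fresh as the code itself.
$class = new ReflectionClass(Invoice::class);
echo $class->getDocComment(), PHP_EOL;
foreach ($class->getMethods() as $method) {
    echo $method->getName(), ': ', $method->getDocComment(), PHP_EOL;
}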

PHP un-editable pdf/file export option

I am developing an application in the Kohana PHP framework that assesses performance. The end result of the process is a webpage listing the overall scoring and a color coded list of divs and results.
The original idea was to have the option to save this as a non-editable PDF file and email it to the user. After further research I have found this to be not as straightforward as I had hoped.
The best solution seemed to be installing the Unix application wkhtmltopdf, but as the destination is shared hosting, I am unable to install it on the server.
My question is: what's the best option for saving a non-editable review of the assessment for the user?
Thank you for your help with this.
I guess the only way to generate a snapshot, or "review" as you call it, is by storing it on the server side and only granting access via a read-only protocol. So basically, by offering it as a 'web page'.
Still, everyone can save and modify the markup. But that is the case for every file you generate, regardless of the type of file. OK, maybe except DRM-infected files. But you don't want to do that, trust me.
Oh, and you could also print the files. Printouts are pretty hard to be edited. Though even that is not impossible...
I found a PHP version that is pre-built as a Kohana module: github.com/ryross/pdfview

Aptana Studio 3 with PHP - constant indexing

I'm using Aptana Studio 3 with several big PHP projects (10,000+ files) and it suffers from very slow indexing of PHP files, which takes 10-20 minutes to complete and starts every time Aptana starts up, and also at seemingly random moments, for example when synchronizing with SVN.
In the progress view I get multiple 'Indexing new PHP Modules' items.
The whole time it is doing this, Aptana is unusably slow. I don't get why this indexing starts over and over again on files that aren't new at all!
I already turned off automatic refreshes and automatic build. If I exclude 'PHP' from the 'Project Natures' in the properties of the projects, the indexing stops, but then I don't have code completion in PHP files.
I cleaned all projects, created a new workspace, etc., and nothing helps. This happens on multiple PCs (Windows), so I guess more people get this behaviour.
Any possible solutions?
UPDATE
I added the folder of my workspace to the 'ignore'-folders of my virus scanner (Microsoft Security Essentials). At first this seemed to work, but then the indexing started again...
Seems like you did the right steps to try and resolve it, and it also seems we should have a ticket for that, so I created one at https://jira.appcelerator.org/browse/APSTUD-4500 (please add yourself as a 'watcher').
One more thing to try is to break a big project down into a few smaller ones (whenever possible, of course). The indexer creates a binary index file for each project, and the size of this file is proportional to the number of classes, functions, variables and constants in your project. In case, for some reason (e.g. a bug), this file gets corrupted, a re-index will happen, so having multiple smaller projects may help with that. Again... just an idea.

Deleting large chunks of PHP effectively

I've just inherited a project, and been told that an entire folder, "includes/", needs to be removed due to licensing issues; we don't have the right to redistribute the files in that folder, so we need to cut our dependencies on them and fix whatever breaks. I've been told "Less than 5% of the lines in that folder are ever even called by our program", but I have no way of verifying this.
There are about 50 files in the folder, each with a couple of hundred lines of code. There is no unit testing currently in place. There's one master file, include.php, that require()s all 49 other files, so I can't just grep for individual files doing include() or require() on includes/.*.
This is about as much detail as I've really figured out at this point. I spent all last week reading through the files in the includes/ folder, and it won't be hard to rewrite any of this, but I'm having trouble deciding where to start. I tried deleting the folder and slowly fixing things that break, but I'm afraid that this route will cause me to miss some crucial functions in my rewrite.
Can anyone point me in a direction to get started? Are there tools that will simplify this process? I'm looking at xdebug right now, but I'm not sure exactly how I'd use it for this.
You may want to search for "php code coverage." That should help you figure out what code is used. For instance, this looks like it might help:
http://www.xdebug.org/docs/code_coverage
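As a rough sketch of how Xdebug's code-coverage functions could be used for this (the entry point and the way you exercise the app are assumptions):
<?php
// Requires the Xdebug extension; also collect "not executed" and dead-code info.
xdebug_start_code_coverage(XDEBUG_CC_UNUSED | XDEBUG_CC_DEAD_CODE);

require 'includes/include.php'; // hypothetical entry point
// ... exercise the application here ...

$coverage = xdebug_get_code_coverage();
xdebug_stop_code_coverage();

// Status codes: 1 = executed, -1 = not executed, -2 = dead code.
foreach ($coverage as $file => $lines) {
    $executed = 0;
    foreach ($lines as $status) {
        if ($status > 0) {
            $executed++;
        }
    }
    echo $file, ': ', $executed, ' executed line(s)', PHP_EOL;
}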
Your initial approach isn't bad at all. It's certainly a reasonable place to start:
1. Delete the code that isn't allowed.
2. Try to run what's left.
3. If things break: create a stub for a method that is now missing, and set it to return some sensible "default" value for now (see the stub sketch below).
4. Goto 2.
Then, itemize all the things that were missing, and make a sensible schedule to re-implement each thing.
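A stub for step 3 might look something like this (the function name and default return value are invented for illustration):
<?php
// Hypothetical replacement for a helper that used to live in includes/.
if (!function_exists('format_invoice_number')) {
    function format_invoice_number($id)
    {
        // TODO: re-implement properly; return a harmless default so callers keep working.
        return (string) $id;
    }
}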
I would start by grepping for files that reference include.php. Check through them one by one, if they're manageable. Then I'd grep for each of the functions in the /include/*.php files. See if they're called anywhere; find 'em, replace 'em.
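A sketch of that grep approach (the paths and function name are illustrative):
# Which files pull in the master include?
grep -rln "include.php" --include="*.php" .
# For each function defined under includes/, look for call sites outside that folder:
grep -rn "some_helper_function(" --include="*.php" . | grep -v "^\./includes/"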
Because PHP is so dynamically typed, I don't think there's going to be a tool for this.
(Eagerly awaiting someone to prove me wrong because I have similar tasks all the time... )
See SD PHP Test Coverage Tool. It will provide a visual view of what code actually executes, as well as a report on what parts of files are used (including "no parts", which is your cue that the code is a likely candidate to delete).
It doesn't require any hand-modifications of your code, or any unit tests to run it.
To answer my own question, I wound up using the Xdebug profiler to do the job, as I was initially investigating (after a friend's suggestion prompted me to take a second look).
In my /etc/php5/apache2/conf.d/xdebug.ini (on Ubuntu 9.10), I set xdebug.profiler_enable=1 and xdebug.profiler_output_dir=/var/log/xdebug/, then loaded the resulting cachegrind files into KCacheGrind and just ran a search on filenames for "includes/".
Now I have a mountain of work ahead of me to remove all this, but at least I've got a good overview of what I'll be modifying!
