I'm using Aptana Studio 3 with several big PHP projects (10,000+ files), and it suffers from very slow indexing of PHP files. The indexing takes 10-20 minutes to complete, starts every time Aptana starts up, and also kicks in at seemingly random moments, for example when synchronizing with SVN...
In the progress view I get multiple 'Indexing new PHP Modules' items.
The whole time this indexing runs, Aptana is unusably slow. I don't understand why the indexing starts over and over again on files that aren't new at all!
I already turned off automatic refreshes and automatic build. If I exclude 'PHP' from the 'Project Natures' in the properties of the projects, the indexing stops, but then I don't have code completion in PHP files.
I cleaned all projects, created a new workspace, etc., and nothing helps... This happens on multiple PCs (Windows), so I guess other people are seeing this behaviour too.
Any possible solutions?
UPDATE
I added the folder of my workspace to the 'ignore'-folders of my virus scanner (Microsoft Security Essentials). At first this seemed to work, but then the indexing started again...
It seems like you took the right steps to try to resolve it, and it also seems we should have a ticket for that, so I created one at https://jira.appcelerator.org/browse/APSTUD-4500 (please add yourself as a 'watcher').
One more thing to try is to break a big project down into a few smaller ones (whenever possible, of course). The indexer creates a binary index file for each project, and that file's size is proportional to the number of classes, functions, variables and constants in the project. If, for some reason (e.g. a bug), this file gets corrupted, a re-index will happen, so having multiple smaller projects may limit the impact. Again... just an idea.
Even though a few similar questions seem to exist, I think this one is unique. I'm not asking whether there are any hard limits; it's only about performance drawbacks in the context of Apache, or the Unix file system in general.
Let's say I request a file from an Apache server:
http://example.com/media/example.jpg
does it matter how many files there are in the same "media" directory?
The reason I'm asking is that my PHP application generates images on the fly.
Once an image is created, the script places it at exactly the path that would otherwise trigger the PHP script via mod_rewrite. That way, if the file exists, Apache skips the whole PHP execution and serves the static image directly instead; a kind of gateway cache, if you want to call it that. (A minimal rewrite sketch follows the list below.)
Apache has basically two things to do:
Check if the file exists
Serve the file or forward the request to PHP
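A minimal .htaccess sketch of this pattern (assuming the images live under media/ and a hypothetical image.php generates missing ones; both names are placeholders):

RewriteEngine On
# if the requested file does not exist on disk...
RewriteCond %{REQUEST_FILENAME} !-f
# ...hand the request to the PHP generator instead
RewriteRule ^media/(.+)$ image.php?file=$1 [L,QSA]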
So far I have about 25,000 files totalling about 8 GB in this single directory, and I expect that to grow at least tenfold over the next few years.
While I don't face any issues managing these files, I have the slight feeling that it keeps getting slower when requesting them via HTTP. So I wondered if this is really what happens or if it's just my subjective impression.
Most file systems based on the Berkeley FFS will degrade in performance with large numbers of files in one directory due to multiple levels of indirection.
I don't know about other file systems like HFS or NTFS, but my suspicion is that they may well suffer from the same issue.
I once had to deal with a similar issue and ended up using a map for the files.
I think it was something like md5('myfilename-00001'), yielding (for example) e5948ba174d28e80886a48336dcdf4a4, which I then stored in a file named e5/94/8ba174d28e80886a48336dcdf4a4. A map file then mapped 'myfilename-00001' to 'e5/94/8ba174d28e80886a48336dcdf4a4'. This not-quite-elegant solution worked for my purposes, and it only took a little bit of code.
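A minimal PHP sketch of that idea (the function name and the write/map steps are made up for illustration):

<?php
// Turn a logical name into a sharded storage path, e.g.
// "myfilename-00001" -> "e5/94/8ba174d28e80886a48336dcdf4a4"
function shardedPath($name) {
    $hash = md5($name);
    return substr($hash, 0, 2) . '/' . substr($hash, 2, 2) . '/' . substr($hash, 4);
}

$name = 'myfilename-00001';
$imageData = '...';                      // whatever your app generated (assumption)
$path = shardedPath($name);
if (!is_dir(dirname($path))) {
    mkdir(dirname($path), 0755, true);   // create e5/94/ if needed
}
file_put_contents($path, $imageData);    // write the file itself
$map[$name] = $path;                     // persist this mapping in your map file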
My friend and I are in different countries and have been developing a LAMP web app for several weeks. All this time we have been sharing source code over FTP, and this way the PHP files become messy. I have heard about CVS and have been reading about it, but I still cannot figure out how exactly it works.
How could CVS help me in this matter?
I would much appreciate it if someone could point me in the right direction.
OK, here comes a very simple explanation of version control systems (VCS). After using one for a while you'll laugh at this explanation, but for now I guess it should help you.
What are the problems with your current FTP file sharing?
If two people upload the same file, one of the copies gets overwritten
After uploading a file you only see who changed it (the last time), but not what was changed inside it
You can't attach information about the changes (other than putting comments in the files themselves)
You can't go back in time; once overwritten, old files are lost
With version control you can solve these problems:
Files either get merged into one new version, or get overwritten, but the old version is still stored so you can roll back if needed
You can see who made which changes when
You can attach comments when you "upload" your files describing what got changed (without storing these comments inside the files)
You can always go back in time and restore old "uploads"/changes
You can also create small side projects by branching. This basically lets you split your project into smaller pieces and work on them separately.
So at the beginning of your work you usually bring your local sources up to date by getting all the changes that were made. Then you do your work, and afterwards you update the online version with your changes so that other developers can pull them and continue to work on them or integrate them into their own current changes. (A sketch of this cycle with Git follows below.)
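With Git, for example, that daily cycle looks roughly like this (branch and remote names here are the common defaults, not necessarily what your project uses):

git pull origin master       # get the changes the others made
# ... edit your files ...
git add -A                   # stage everything you changed
git commit -m "Describe what you changed and why"
git push origin master       # publish it so the others can pull it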
How to implement this sorcery?
You could google for "how to implement git" or "how to implement svn", but as a beginner I would recommend using an online service. Here is a list of services: https://git.wiki.kernel.org/index.php/GitHosting
My personal preference for closed-source projects with a small number of developers is https://bitbucket.org/. Some of the services also provide a small wiki page and a bug-tracking tool. If you want to use Bitbucket, here is the very easy-to-understand documentation: https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+101
Important to know:
Soon you'll learn that you don't really "upload files" as I've written multiple times above; you change lines of code. And you don't upload them, you "commit" them.
While CVS could help, not many developers would recommend it for new projects. It has largely been replaced by Subversion (SVN), but even that is falling out of favour. Many projects these days use distributed version control with Git or Mercurial (hg).
A good introduction to git can be found in the free online book Pro Git.
In any case, these things are all version control systems. They help to synchronize the code between developers, and also let you track
who changed code,
when it was changed,
why it was changed, and
how it was changed.
This is very important on projects with multiple developers, but there is value in using such a system even when working on your own.
I am doing some tests (LAMP):
Basically I have two versions of my custom framework.
A normal version that includes ~20 files.
A lite version that has everything inside one single big file.
Using the lite version more and more, I am seeing a decrease in load time, i.e. from 0.01 s for the normal version to 0.005 s for the lite version.
Let's consider just the "include" part. I always thought PHP would store the included .php files in memory so the file system doesn't have to retrieve them at every request.
Do you think condensing all the classes/functions into one big file is worth the "chaos"?
Or is there a setting to tell PHP to keep the required PHP files in memory?
Thanks
(PHP 5.3.x, Apache 2.x, Debian 6 on a dedicated server)
Don't cripple your development by mushing everything up in one file.
A speed up of 5ms is nothing compared to the pain you will feel maintaining such a beast.
To put it another way, a single incorrect index in your database can give you orders of magnitude more slowdown.
Your page would load faster using the "normal" version and omitting one 2kb image.
Don't do it, really just don't.
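As for the question about keeping the required files in memory: stock PHP 5.3 re-reads and re-compiles every included file on each request. An opcode cache such as APC keeps the compiled scripts in shared memory between requests, which typically buys far more than merging files ever would. A minimal php.ini sketch (assuming the APC extension is installed; the size is just an example value):

extension=apc.so
apc.enabled=1
; shared memory for the compiled-script cache
apc.shm_size=64M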
Or you can do this:
Leave the code as it is (located in many different files)
Combine the files into one when you are ready to upload to the production server
Here's what I use:
cat js/* > all.js
yuicompressor all.js -o all.min.js
First I combine them into a single file and then I minify it with the YUI Compressor.
Right, ever since watching Rasmus Lerdorf's talk on PHP performance I've been wanting to profile the ERP/accounting application I am working on, not least because I know there are performance issues with it; profiling should highlight the major problems for me to investigate.
So downloaded xdebug and put the following few lines in my php.ini file:
zend_extension="/usr/lib/php5/20090626+lfs/xdebug.so"
xdebug.profiler_output_dir="/home/me/xdebug/profiles/"
xdebug.profiler_enable_trigger=On
With this I simply point my browser at my app with &XDEBUG_PROFILE in the query string, and the profiling begins. The problem is that the output I am viewing with KCachegrind doesn't include any of the functions from within my application, and no call flow between entities.
While the page was executing, I copied (in the terminal) the profile file to a separate file several times, to capture its state throughout the profiling run. I loaded each of these copies separately into KCachegrind, and they all show the full profile of the application, all except the last one.
Can anyone tell me why the full profile isn't being output? Looking at the file sizes of my copies, the first few are rather large but the last one is significantly smaller. Is Xdebug rewriting the file after the run has been captured?
Many thanks :-)
EDIT
Just to help, this is what I see when I open up one of the copied profiles (taken before the profiling completed); I'm sure there is much more to it than this.
And this is what I get from the final profile: no relationships, just a bunch of PHP functions. I want to see the full profile.
EDIT 2
So here I was repeatedly running ls -als; the last listing shows the cut-down version, and the one before it is the last ls where the file was still at its full size.
I cannot upload the large file as it's over 3 million lines long. If it helps, here is the Xdebug section of my phpinfo() output.
Right, I've actually solved the problem myself: I added this option to my php.ini file:
xdebug.profiler_append=1
This appends the data to the same file if it already exists, so I'll need to make sure the output filename option is set correctly, but I think that has solved my problem for now.
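For completeness, the output filename is controlled by xdebug.profiler_output_name. With appending enabled you want a name that stays stable across requests to the same script, for example (%s expands to the script name):

; one cachegrind file per script, appended to on every profiled request
xdebug.profiler_output_name=cachegrind.out.%s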
Thanks to those that answered :-)
I have a problem with maintenance of my PHP-based website, which is built on the Zend Framework. When I upload a new copy or version, the site understandably won't run during the upload, especially while crucial files like models and controllers are being rewritten.
Is there a way to upload a website without having to go through this issue?
My updates are really quite regular, say once or twice a week in this case.
You can make use of the fact that renaming directories is quick and easy, even over FTP. What I usually do is (a script sketch follows the list):
Have two directories, website_live and website_upload
website_live contains the live website (obviously)
Upload contents to website_upload
Rename website_live to website_old (or whatever)
Rename website_upload to website_live
Done! Downtime is less than two seconds if you rename quickly.
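On Linux, the swap itself boils down to two renames; a minimal shell sketch (directory names as above, run from the web root):

# upload finished? then swap the directories
mv website_live website_old && mv website_upload website_live
# keep website_old around for a quick rollback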
It gets a bit more complex if the old version contains uploaded content (e.g. from a CMS) that you need to carry over to the new one. It's cumbersome to do manually every time, but basically it comes down to simple rename operations too (renaming works effortlessly over FTP as well).
This is a task that can be automated quite nicely using a simple deployment script. If you're on Linux, setting up a shell script for this is easy. On Windows, a very nice tool I've worked with to do automated FTP synchronizing, renaming and error handling - even with non-technical people starting the process - is ScriptFTP. It comes with a good scripting language, and good documentation. It's not free, though.
If you're looking to get into hard-core automated PHP deployment, I've been doing some research in that field recently. Maybe the answers to my recent bounty question can give you inspiration.