I'm developing a web application, and I'm using a file called functions.php in which I store all the functions related to my application. Currently the file has about 1500 lines of code with over 30 functions.
Is this a problem? Could it slow things down when calling functions? Should I move some of the functions into other files?
In most cases, the number of lines in the .php script isn't going to affect the speed of the program nearly as much as the code itself. If execution time is your number one concern, then optimizing your code should be your number one priority. Start with the functions that are called the most, and make sure the code there is as tight as possible.
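Before optimizing, it helps to measure. Here's a minimal sketch of timing a hot function with microtime() - the format_price() function and the call count are just hypothetical stand-ins:

    <?php
    // Hypothetical hot function we want to measure.
    function format_price($amount) {
        return number_format($amount, 2) . ' USD';
    }

    $start = microtime(true);              // high-resolution timer
    for ($i = 0; $i < 100000; $i++) {
        format_price($i * 0.01);
    }
    $elapsed = microtime(true) - $start;
    echo "100000 calls took {$elapsed} seconds\n";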
Splitting the functions up into different files would technically make the script slower since the interpreter would have to do disk I/O to parse the files. But the speed hit would be infinitesimal, so I'd argue that splitting them up might save time in the long run since it'll be easier to debug and optimize if you're not always staring down a huge file with 30 functions in it.
Finally, if you do have to use a bunch of huge .php files in your app, you might want to look into something like Zend Optimizer, which caches the compiled form of your scripts on the server.
If you only use one or two functions per page, would it be faster to split them up into separate files and only include the ones you need? Yes, because PHP needs to read through the whole file to include it. Will it make a noticeable enough difference that it's worth splitting the file? In most cases the answer's probably no.
I am creating a PHP website with several sections. Is it safe to keep all of my functions in one file and then include it in every other file?
It would certainly make things easier for me, but do you think it's a good idea, in terms of both security and speed? If I keep all my functions in a single file it will definitely become quite big, and a lot of pages won't need most of the functions, so wouldn't that affect my script's speed?
And do you think it's wise to keep all of them together? Aren't I just making it easier for hackers to find the core of my script? What do you suggest I should do?
Big, long functions files were (and still are, to an extent) pretty common in PHP projects.
Security of all files should certainly be a consideration, but a single functions file is no different from any other, really. Best practice is to keep it outside of your web root so that if PHP fails (it happens), your file still won't be accessible. If the server gets hacked, the location and format of your files is unlikely to make any difference.
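For example, a layout along these lines (the directory names are hypothetical) keeps the functions file out of reach of direct requests:

    <?php
    // public_html/index.php -- the only part under the document root.
    // functions.php lives one level up, so the web server can never
    // serve it as a raw file, even if the PHP handler fails.
    require_once dirname(__DIR__) . '/includes/functions.php';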
Modern frameworks typically include hundreds of different files on every page load. The extra time this takes is barely measurable and probably not worth thinking about.
For maintainability, it's rarely a good idea to have one massive file. You should look at separating these into utility classes and autoload them as needed.
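A minimal sketch of that approach, using PHP's standard spl_autoload_register() - the StringUtil class and the lib/ directory are hypothetical:

    <?php
    // Load each class file on demand, the first time the class is used.
    spl_autoload_register(function ($class) {
        $file = __DIR__ . '/lib/' . $class . '.php';
        if (is_file($file)) {
            require $file;
        }
    });

    // Only lib/StringUtil.php is parsed here; every other class file
    // is left untouched until (and unless) it is actually needed.
    $util = new StringUtil();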
Security-wise, it is safe as long as the file is kept under the right permissions.
It is not best practice, and I suggest you take a look at PHP's autoloading.
Security has nothing to do with the size of the file.
You just need to make it inaccessible (which you can do with .htaccess) and hidden from the public (keep it outside of the web root).
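If the file does have to live under the web root, a couple of .htaccess lines can deny direct requests to it (Apache 2.4 syntax; the filename is whatever yours is called):

    # .htaccess in the same directory as the functions file
    <Files "functions.php">
        Require all denied
    </Files>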
And what about speed?
I think it's nicer to organize the files as specifically as possible.
If there are many tiny functions, you can organize them by their shared characteristics (string functions, array functions, etc.).
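A sketch of what that grouping might look like - the file names and the helper functions named in the comments are hypothetical:

    <?php
    // Each page pulls in only the groups it actually uses.
    require_once __DIR__ . '/functions/string.php'; // slugify(), trim_words(), ...
    require_once __DIR__ . '/functions/array.php';  // array_pluck(), group_by(), ...
    // functions/image.php stays untouched on pages that never handle images.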
The time overhead is very small, to the point of being negligible.
And I think maintainability is much more important than that negligible performance difference.
That's basically my whole question: if I have PHP files with 5,000-10,000 lines of code for a certain purpose (in my case image upload management, cropping and such), would including them on every page slow down the pages that don't use them? Basic logic tells me it of course would, but I'm not an expert, so I don't know whether PHP behaves differently than I understand.
include and require statements make PHP compile/interpret the files that you include. That does cost some computation, but in 99% of cases it won't matter... unless your site is very popular and saving that computation time is important. If that is the case, you can solve this very easily by using so-called PHP accelerators (like XCache or APC). These can be installed along with your PHP installation and cache in RAM all the compiled opcode from your PHP scripts. Improvements with this solution vary between 40% and 75%.
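You can check from within PHP whether such an accelerator is actually active - this sketch tests for APC specifically:

    <?php
    // extension_loaded() and ini_get() are standard PHP functions;
    // 'apc.enabled' is APC's own ini switch.
    if (extension_loaded('apc') && ini_get('apc.enabled')) {
        echo "APC is on; compiled opcode is served from RAM.\n";
    } else {
        echo "No APC; every request recompiles the included files.\n";
    }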
There will be a slight slowdown, as the unused functions (extra code) need to be parsed, and they will also take extra memory. Apart from that, there is no other effect.
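You can get a rough feel for that extra memory yourself - a quick sketch using PHP's memory_get_usage() (the file name is your own):

    <?php
    $before = memory_get_usage();
    require_once __DIR__ . '/functions.php'; // the big functions file
    $after = memory_get_usage();
    echo 'Including it cost ', $after - $before, " bytes\n";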
I want to profile and optimize my PHP scripts with regard to file I/O. First I need to count how much I/O happens within a script and all of its includes. How do I do that without modifying the scripts? Is there any way to overload the file-related functions and add counting to them? Or an extension, at least?
I think the best you'll get is finding out how many (and which) files have been included (which is also a kind of file I/O).
http://php.net/manual/en/function.get-included-files.php
I don't think there's a built-in "how much file I/O did fread/fwrite do" function that would allow you to figure all that out without modifying your scripts or server.
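A small sketch of what get_included_files() gives you - drop it at the end of a page to see the include count and the size of each file read:

    <?php
    $files = get_included_files();
    echo count($files), " files were included:\n";
    foreach ($files as $file) {
        echo $file, ' (', filesize($file), " bytes)\n";
    }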
Use vmstat and iostat to find out whether you are CPU-, memory-, or I/O-bound. You can also use strace to log system calls. Use Xdebug or XHProf to profile your PHP application.
"So at first it is necessary to count how much i/o happens within a script and all of its includes."
It is not necessary to count how much I/O happens within a script and its includes.
It is only necessary to pause it at random several times and each time examine the call stack. If one I/O takes twice as much time as another, you're twice as likely to see it on each pause. It doesn't matter if it's in your script or in an include. Anything that takes enough time to be worth optimizing will appear more than once.
I hope this is not a completely stupid question. I have searched quite a bit for an answer, but I can't find (or recognise) one exactly on point.
I understand that functions in PHP are not parsed until actually run. Therefore, if I have a large class with many functions, only one of which requires a large include file, will I potentially save memory if I only include the "include file" within the function (as opposed to at the top of the class file)?
I presume that, even if this would save memory, it would only do so until such time as the function was called, after which the memory would not be released until the current script stopped running?
I love this saying: "Make it work and then, if needed, make it fast." -some good programmer?
In most cases you would probably be better off focusing on good OOP structure and application design than on speed. If your server is using something like Zend Optimizer, having all your methods in a single file won't make any difference, since it is all pre-compiled and stored in memory. (It's more complicated than this, but you get the idea.)
You can also load all your include files when Apache starts. Then all the functions are loaded in memory. You wouldn't want to do that while developing, unless you like restarting Apache every time you make a code change, but on production servers it can make a huge difference. And if you really want to make things fast, you can write the code in C++ and load it as a module for Apache.
But in the end... do you really need that speed?
Yes it will, but be sure that the function doesn't depend on any other functions included by the parent. Memory consumption also depends on a couple of things, from the size of the file itself to the amount of memory it needs for its variables and for proper garbage collection.
If the function is inside a class, it's called a method, and it might depend on its class extending another class.
Just some things to consider. Always include the bare minimum.
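A sketch of the deferred include the question describes - the library file and the big_image_crop() function are hypothetical:

    <?php
    class ImageManager
    {
        // The heavy library is parsed only when crop() is first called,
        // not when this class file is loaded.
        public function crop($path, $width, $height)
        {
            require_once __DIR__ . '/lib/big_image_lib.php'; // hypothetical
            return big_image_crop($path, $width, $height);   // hypothetical
        }

        // Methods that don't need the library never pay its cost.
        public function exists($path)
        {
            return is_file($path);
        }
    }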
Don't save memory in cases like this unless you really need to; save development time instead. Memory is usually cheap, but development/support time isn't. Use a PHP opcode cacher like eAccelerator or APC; it will increase execution speed because all files will be pre-compiled and stored in memory.
I was just reading over this thread where the pros and cons of using include_once and require_once were being debated. From that discussion (particularly Ambush Commander's answer), I've taken away the fact(?) that any sort of include in PHP is inherently expensive, since it requires the processor to parse a new file into opcodes and so on.
This got me to thinking.
I have written a small script which will "roll" a number of JavaScript files into one (appending all their contents into another file), such that it can be packed to reduce HTTP requests and overall bandwidth usage.
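For illustration, a minimal version of such a rolling script might look like this (the file list is hypothetical):

    <?php
    // Concatenate several JavaScript files into one to cut HTTP requests.
    $sources = array('jquery.js', 'menu.js', 'forms.js');
    $bundle  = '';
    foreach ($sources as $src) {
        $bundle .= file_get_contents(__DIR__ . '/js/' . $src) . ";\n";
    }
    file_put_contents(__DIR__ . '/js/all.js', $bundle);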
Typically for my PHP applications, I have one "includes.php" file which is included on each page, and that then includes all the classes and other libraries which I need. (I know this probably isn't best practice, but it works - the __autoload feature of PHP5 is making this better in any case.)
Should I apply the same "rolling" technique on my PHP files?
I know the saying about premature optimisation being evil, but let's treat this question as theoretical, OK?
There is a problem with Apache/PHP on Windows which causes the application to be extremely slow when loading or even touching too many files (a page which loads approx. 50-100 files may spend a few seconds on file operations alone). This problem appears both with including/requiring and with working with files (fopen, file_get_contents, etc.).
So if you (or, more likely, anybody else, given the age of this post) ever run your app on Apache/Windows, reducing the number of loaded files is absolutely necessary. Combine multiple PHP classes into one file (an automated script for this would be useful; I haven't found one yet) or be careful not to touch any unneeded file in your app.
That would depend somewhat on whether it is more work to parse several small files or to parse one big one. If you require files on an as-needed basis (not saying you necessarily should do things that way), then presumably for some execution paths there would be considerably less compilation required than if all your code were rolled into one big PHP file that the parser had to compile in its entirety, whether it was needed or not.
In keeping with the question, this is thinking aloud more than expertise on the internals of the PHP runtime, and it doesn't sound as though there is any real-world benefit to getting too involved with this at all. If you run into a serious slowdown in your PHP, I would be very surprised if the use of require_once turned out to be the bottleneck.
As you've said: "premature optimisation ...". Then again, if you're worried about performance, use an opcode cache like APC, which makes this problem almost disappear.
This isn't an answer to your direct question, just about your "js packing".
If you leave your JavaScript files alone and allow them to be included individually in the HTML source, the browser will cache those files. Then on subsequent requests, when the browser requests the same JavaScript file, your server will return a 304 Not Modified header and the browser will use the cached version. However, if you're "packing" the JavaScript files together on every request, the browser will re-download the combined file on every page load.
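If the rolled-up file is served through PHP, you can still get that 304 behaviour by honouring If-Modified-Since yourself - a sketch, assuming the bundle sits at js/all.js:

    <?php
    $path  = __DIR__ . '/js/all.js';
    $mtime = filemtime($path);

    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

    // If the browser's cached copy is still current, send 304 and no body.
    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
        strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }

    header('Content-Type: application/javascript');
    readfile($path);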