Require constant-defining files at once or by utilisation in PHP?

I have about 20 files, each containing 10-15 lines like this:
define('someConstantName', 'stringBetween10and200Chars');
Those constants are all critical for the app, but each constants file corresponds to one app page.
For example, index.php requires index_constants.php, and so forth.
The question is: should I put all the constant definitions together in one file, or require separate files according to the pages they belong to?
I'm asking in terms of speed and efficiency.
Thanks.

If you are looking for performance, use a separate file for each page, since a single combined file would also load data the page doesn't need. But if you are looking for ease and simplicity, you can put them all in one file (if you think that will not create any sort of complexity for you). Thanks...
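As a minimal sketch of the per-page approach (the file and constant names here are hypothetical), each page requires only its own constants file, so a request only processes the define() calls it actually needs:

```php
<?php
// index_constants.php (hypothetical name): constants used only by index.php.
define('SITE_TITLE', 'My App');
define('ITEMS_PER_PAGE', 25);

// index.php would then begin with:
//   require __DIR__ . '/index_constants.php';

echo SITE_TITLE, "\n";
echo ITEMS_PER_PAGE, "\n";
```

The single-file alternative is identical except that every page requires the same combined constants file.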

Related

One PHP file for multiple pages?

I'm creating a dashboard which will have different pages for different purposes. I'm wondering whether I should have one PHP file for each page, or one file for all the pages.
Pros as I see them:
One file with all functions to simplify work (fewer files, less clutter)
Cons:
Longer loading time because of one large file?
I guess my main question is whether there really is a major advantage/disadvantage to either?
To use a building analogy: nobody says we can't stack 50 rooms on top of one another instead of spreading 25 across two levels!
1) Putting everything in one file is indeed less clutter, but it is more likely to break everything once something goes wrong at some point.
2) Depending on how many files you have to create, I would advise separating functionality into multiple files. If one breaks, the others may still work.
3) I love Ajax in that it can reduce this whole "clutter issue", keeping in mind, though, that everything will then rely on your Ajax functions being well written and securely put together.
:)

PHP - One file or multiple (using require) - difference

I have one big piece of PHP code divided into multiple files, each loaded with require (exactly once each).
I'm curious whether it's faster to have one big .php file with the whole code, or different parts in different files (each part is always used).
Let's say a thousand people are refreshing the page: how much does the number of files the code is divided into change the speed? Which way is faster, and why (what does it depend on)?
Thanks for any advice.
In terms of processing it really doesn't make a difference; requiring is like copy/pasting the file into the file you're requiring it from.
It also keeps your code more organised.
EDIT:
Since require resolves relative paths against the include_path, it will be slightly slower, but not noticeably so, even with enormous numbers of visitors refreshing.
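To illustrate the copy/paste behaviour, here is a self-contained sketch. The "library" file is generated at runtime only so the example runs standalone (in a real project it would already exist on disk), and the absolute path avoids the include_path lookup mentioned in the edit:

```php
<?php
// Write a tiny library file just for this demo.
$lib = sys_get_temp_dir() . '/math_lib_demo.php';
file_put_contents($lib, '<?php function add($a, $b) { return $a + $b; }');

// An absolute path skips the include_path search; after the require,
// add() behaves exactly as if its code had been pasted in right here.
require $lib;

echo add(2, 3), "\n"; // prints 5
unlink($lib);
```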

Should a very long function/series of functions be in one php file, or broken up into smaller ones?

At the moment I am writing a series of functions for fetching Dota 2 matches from the Steam API. When someone fetches their games, I have to (for my use case) take a history of all of their games (let's say 3 API calls), then all the details from each of those games (so if there are 200 games, another 200 API calls). This takes a long time, and so far I'm programming all of the above in one PHP file, "FetchMatchHistory.php", which is run when the user clicks a button on the web page.
Another thing making me feel it should be in one file is that I imagine it is probably good practice to put all of the information (in this case match history, match details, IDs, etc.) into the database at once, so that there don't have to be null values in the database?
My question is whether a function that takes a very long time should be in just one PHP file ("should" meaning: is it generally considered good practice), or whether I should break the separate functions down into smaller files. This is very context dependent, I know, so please forgive me.
Is it common to have API calls spanning several PHP files if that is what you are making? Is there a security/reliability issue with having only one file doing all the leg-work (so to speak)?
Good practice is to group a number of related functions together in a PHP file whose name describes them, both to organize them better and for caching reasons, since some parts are updated less often than others.
But speaking of performance, I doubt you'll get the improvements you seek just by moving code between files.
Personally, I used to have the habit of putting everything in one file, which consistently meant:
fat files
hard-to-update code
hard-to-read code
trouble finding the thing I wanted (Ctrl+F meltdown)
wasted bandwidth uploading parts that did not need to be updated
virtually no caching on the server
I don't know whether any of the above applies to your app, but breaking code out into relevant files/places made my life easier.
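As a sketch of that split (all file and function names here are made up), grouping helpers by concern lets a page require only the group it uses. The two small library files are generated at runtime purely to keep the example self-contained; normally they would live in your repository:

```php
<?php
$dir = sys_get_temp_dir();
file_put_contents("$dir/string_helpers.php",
    '<?php function slugify($s) { return strtolower(str_replace(" ", "-", $s)); }');
file_put_contents("$dir/date_helpers.php",
    '<?php function iso_date($ts) { return date("Y-m-d", $ts); }');

// A page that only manipulates strings pulls in just the string helpers:
require_once "$dir/string_helpers.php";
echo slugify('Hello World'), "\n"; // prints "hello-world"
```

A page doing only string work never pays for parsing the date helpers.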
UPDATE:
As for database practice: you're going to query only the parts you want updated.
I don't understand why you would split that logic across files; that by itself is not going to give you performance. What will give you performance is updating only the relevant parts and keeping tables with relevant content. Multiple tables make a lot more sense, since you can use them as pointers to the large data contained in other tables, reducing the waste of cramming everything into one table.
Also, don't forget that a single table has limitations; I personally try to have as few columns as possible. Keep adding more and one day you can't add any, because of the row-size limit. There is a maximum number of columns in general, but that limit is rarely hit by a developer; the growing per-row content itself will exhaust the row-size limit first.
Whether to split server side code to multiple files or keep it in a single one is an organizational issue, more than a security/reliability one...
I don't think it's more secure to keep your code in separate source files.
It's entirely a matter of how you prefer to organize and maintain your code base.
Usually, I separate it when I can find some kind of "categories" in my code.
Obviously, if you write OO code, the most common choice is to keep each class in a single file...

Is it better to split functions into different files?

I have a PHP file that handles a large number of MySQL queries for my site, and I'm expecting hundreds to thousands of users using it at a time. Will it make a difference to the server's response speed if I keep all these functions in a single file, or should I split them?
If you have a highly frequented web page, you should organize your code properly, because this sounds like more than a guestbook page; otherwise the project will run out of your control. So put each function into its own file!
Only your second thought should be about performance. You could think about using an opcode cache, or other improvements like a static-map autoloader, or a build script that merges your PHP files, and so on.
But don't start to ruin your project's source code with this bad style.
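A static-map autoloader like the one mentioned above can be sketched as follows. The class name and file path are invented, and the class file is written at runtime only so the example runs standalone; in practice a build script would generate the map:

```php
<?php
// Generate a class file for the demo (normally it already exists).
$classFile = sys_get_temp_dir() . '/Greeter.php';
file_put_contents($classFile,
    '<?php class Greeter { public function hello() { return "hi"; } }');

$classMap = ['Greeter' => $classFile]; // class name => file path

spl_autoload_register(function ($class) use ($classMap) {
    if (isset($classMap[$class])) {
        require $classMap[$class]; // loaded only on first use
    }
});

$g = new Greeter();       // triggers the autoloader
echo $g->hello(), "\n";   // prints "hi"
```

With a map like this, a request only ever parses the class files it actually instantiates.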
Splitting functions into different files is better than writing all functions in one file.
It makes including files faster and performs better, because you include only the functions you need, and you save server memory.
It also makes your code more readable.

Which is better performance in PHP?

I generally include one functions file in the header of my site. The site gets pretty high traffic, and I like to make every little thing the best I can, so my question is:
Is it better to include multiple smaller function files with just the code that's needed for each page, or does it really make no difference to load it all as one big file? My current functions file has all the functions for my whole site; it's about 4,000 lines long and is loaded on every single page load, sitewide. Is that bad?
It's difficult to say. 4,000 lines isn't that large in the realms of file parsing. In terms of code management, that's starting to get on the unwieldy side, but you're not likely to see much of a measurable performance difference by breaking it up into 2, 5 or 10 files, and having pages include only the few they need (it's better coding practice, but that's a separate issue). Your differential in number-of-lines read vs. number-of-files that the parser needs to open doesn't seem large enough to warrant anything significant. My initial reaction is that this is probably not an issue you need to worry about.
On the opposite side of the coin, I worked on an enterprise-level project where some operations had an include() tree that often extended into the hundreds of files. Profiling these operations indicated that the time taken by the include() calls alone made up 2-3 seconds of a 10 second load operation (this was PHP4).
If you can install extensions on your server, you should take a look at APC.
It is free, by the way ;-) but you must be an admin of your server to install it, so it's generally not available on shared hosting...
It is what is called an "opcode cache".
Basically, when a PHP script is called, two things happen:
the script is "compiled" into opcodes
the opcodes are executed
APC keeps the opcodes in RAM, so the file doesn't have to be re-compiled each time it is called -- and that's a great thing for both CPU load and performance.
To answer the question a bit more:
4,000 lines is not that much, performance-wise; open a couple of files of any big application/framework and you'll quickly get to a couple thousand lines
a really important thing to take into account is maintainability: what will be easier to work with for you and your team?
loading many small files might imply many system calls, which are slow; but those would probably be cached by the OS... so probably not that relevant
if you are doing even one database query, that query (including the network round-trip between the PHP server and the DB server) will probably take more time than parsing a couple thousand lines ;-)
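To put rough numbers on that comparison, here is a crude, self-contained timing sketch: it generates a file with a few thousand trivial lines and measures how long a single require takes. Absolute figures depend entirely on the machine and on whether an opcode cache is active; the point is only that parse time is small and measurable:

```php
<?php
// Build a throwaway 4000-line PHP file.
$file = sys_get_temp_dir() . '/big_include_demo.php';
$code = "<?php\n";
for ($i = 0; $i < 4000; $i++) {
    $code .= "\$x$i = $i;\n"; // one trivial assignment per line
}
file_put_contents($file, $code);

// Time the require (compile + execute of all 4000 lines).
$start = microtime(true);
require $file;
$elapsed = microtime(true) - $start;

printf("parsed 4000 lines in %.4f s\n", $elapsed);
unlink($file);
```

On most hardware this finishes in a few milliseconds, typically well under one database round-trip.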
I think it would be better if you could split the functions file up into components appropriate for each page, and include those components in the appropriate pages. Just my 2 cents!
P.S.: I'm a PHP amateur and I'm trying my hand at making a PHP site; I'm not using any functions. So can you enlighten me on what functions you would need for a site?
In my experience having a large include file which gets included everywhere can actually kill performance. I worked on a browser game where we had all game rules as dynamically generated PHP (among others) and the file weighed in at around 500 KiB. It definitely affected performance and we considered generating a PHP extension instead.
However, as usual, I'd say you should do what you're doing now until it is a performance problem and then optimize as needed.
If you load a 4,000-line file and use maybe one function that is 10 lines, then yes, I would say it is inefficient. Even if you used lots of functions totalling 1,000 lines, it is still inefficient.
My suggestion would be to group related functions together and store them in separate files. That way, if a page only deals with, for example, database functions, you can load just your database-functions file/library.
Another reason for splitting the functions up is maintainability. If you need to change a function, you have to find it in your monolithic include file. You may also have functions that are very, very similar without even realising it. Sorting functions by what they do allows you to compare them, get rid of things you don't need, or merge two functions into one more general-purpose function.
Most of the time, disk I/O is what will kill your server, so I think the fewer files you fetch from disk the better. Furthermore, if it is possible to install APC, then the file will be stored compiled in memory, which is a big win.
Generally it is better, file management wise, to break stuff down into smaller files because you only need to load the files that you actually use. But, at 4,000 lines, it probably won't make too much of a difference.
I'd suggest a solution similar to this
function inc_lib($name)
{
    // e.g. inc_lib('db') includes /path/to/lib/db.lib.php
    include('/path/to/lib/' . $name . '.lib.php');
}
function inc_class($name)
{
    // e.g. inc_class('user') includes /path/to/lib/user.class.php
    include('/path/to/lib/' . $name . '.class.php');
}
